[ExI] Eliezer Yudkowsky New Interview - 20 Feb 2023
spike at rainier66.com
Sun Feb 26 21:55:09 UTC 2023
-----Original Message-----
From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of Gadersd via extropy-chat
Subject: Re: [ExI] Eliezer Yudkowsky New Interview - 20 Feb 2023
>...Yudkowsky has good reasons for his doomsaying, but I still can’t shake a gut feeling that he is overestimating the probability of AI destroying humanity. Maybe this gut feeling is off but I can’t help but be mostly optimistic...
Ja, the notion of AI destroying humanity assumes away the risk that we will beat the AI to it. Ordinary biological intelligence may be sufficient to destroy humanity, or at least the technology on which our survival depends.
Consider for instance two existential risks arising from one big change since the early 1950s, when fusion bombs were developed. We have always imagined the bombs destroying cities by detonation at about the altitude a Cessna flies. Since then we have discovered that if you pop one off far above airliner altitude, tens to hundreds of kilometers up, the electromagnetic pulse is sufficient to wreck electronics and communications equipment over a vast area. Even a temporary loss of that infrastructure would do damage to modern civilization that is difficult to comprehend.
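To make the altitude point concrete, here is a minimal sketch (Python, mine, not from the original post) of why burst altitude matters: it computes the line-of-sight footprint radius on the ground using the simple spherical-Earth horizon formula d = R * arccos(R / (R + h)). The altitudes are assumed round numbers for illustration, and the real E1 pulse physics is far more involved, but the footprint scaling is the core of the argument.

import math

R_EARTH_KM = 6371.0  # mean Earth radius

def horizon_radius_km(burst_altitude_km: float) -> float:
    """Ground distance to the line-of-sight horizon for a burst at the
    given altitude, from the spherical-Earth geometry
    d = R * arccos(R / (R + h))."""
    r = R_EARTH_KM
    return r * math.acos(r / (r + burst_altitude_km))

# Illustrative altitudes (assumed round figures, not from the post):
for label, h_km in [("Cessna, ~3 km", 3.0),
                    ("737 cruise, ~11 km", 11.0),
                    ("high-altitude burst, ~400 km", 400.0)]:
    print(f"{label}: footprint radius ~ {horizon_radius_km(h_km):.0f} km")

Under these assumptions a burst at airliner altitude can only "see" a few hundred kilometers of ground, while one a few hundred kilometers up can see a continent-sized region, which is why the EMP scenario involves near-space detonation rather than the low airburst used against cities.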
spike