[ExI] another open letter, but this one may be smarter than the previous one

spike at rainier66.com spike at rainier66.com
Sat Apr 29 15:37:13 UTC 2023



From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of Will Steinberg via extropy-chat


>…What a terrifying time to be alive.  I don't see a plausible scenario where this all doesn't lead to unbelievable amounts of suffering (of both biological and machine consciousness.)


Ja, but consider the alternative.  If we fail to develop a superintelligence of some kind, we know what will happen to us as individuals, ja?  We take comfort in knowing that our progeny live on after we are gone, but we are still gone, and after another easily-foreseeable span of time, they are too.

AI is simultaneously our biggest threat and our only hope.

