[ExI] another open letter, but this one may be smarter than the previous one

Keith Henson hkeithhenson at gmail.com
Sun Apr 30 17:37:26 UTC 2023

On Sat, Apr 29, 2023 at 1:57 PM Will Steinberg via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
> On Sat, Apr 29, 2023 at 11:52 AM Jason Resch via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>> What a terrifying time to be alive.  I don't see a plausible scenario where this all doesn't lead to unbelievable amounts of suffering (of both biological and machine consciousness.)
>> Yes, if we aren't careful and if we don't treat these early systems with respect, something like this nightmare scenario could easily happen: https://qntm.org/mmacevedo
> Ha, if we manage to even get there I'll be surprised.  Sorry to be such a doomer.  It is just scary to see people playing around with such grave ontological concepts as if they're just the next killer app.  Unbridled technocratic competition is going to kill us all.

I think you are being unjustifiably pessimistic.  I don't see any
reason to be either optimistic or pessimistic: the very nature of a
(or the) singularity is that we don't, and can't, know what is on the
other side.
