[ExI] More thoughts on sentient computers

William Flynn Wallace foozler83 at gmail.com
Thu Feb 23 19:29:46 UTC 2023


Thanks, Ben - another question:  why do we, or they, or somebody, think
that an AI has to be conscious to solve the problems we have?  Our
unconscious mind solves most of our problems now, doesn't it?  I think it
does.  bill w

On Thu, Feb 23, 2023 at 12:24 PM Ben Zaiboc via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> bill w asked:
>
>  >Three silly questions: how would you know if you had created a
> conscious mind? Why do you want to do that? What makes that necessary?
> bill w
>
>
> I like silly questions! (some of them anyway)
>
> 1) How would you know?
> Probably you would never know for sure, just as you don't know for sure
> that I'm a conscious mind. But I'd say we'd use the same criteria as we
> use with each other, or for the existence/non-existence of gods: we can
> never be absolutely certain, but we can make a damned good guess based
> on the evidence at our disposal.
>
> 2) Why do it?
> Because we're transhumanists, and want the sum total of self-awareness
> and intelligence in the universe to increase. Because we recognise the
> severe limitations of biological life, and if we can create artificial
> minds, we can overcome these limitations. Because we know that humans
> have a limited lifespan, both as individuals and as a species, and this
> is a way of going way beyond that.
>
> 3) What makes it necessary?
> Well, that depends on your priorities. People who think that humanity is
> a stain on the world, and that things would be better without it,
> probably think it's not only unnecessary but undesirable. I think it's
> necessary because we are tragically weak, fragile and confused, and
> anything we can do to correct or side-step that is a good thing.
> Artificial minds are our chance to pass down our most significant
> quality to the future, in a form that has a chance of surviving and
> thriving in the long term (the very long term, as in billions of years
> and more).
>
> Oh, and it may be the only realistic way to achieve mind uploading. We
> probably aren't capable of figuring it out, or at least of actually
> doing it, by ourselves.
>
> And it may be the only way we're going to get out of the many pickles
> we're getting ourselves into, too. Maybe we need a non-human perspective
> to solve the various seemingly unsolvable problems we've got. I don't
> need to make a list, I'm sure you can think of plenty.
>
> Ben
