<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class="">The split-brain phenomenon illustrates this point. To treat severe epilepsy, some people have had the connection between their brain hemispheres surgically severed, so that the two hemispheres can no longer communicate with each other. Yet each hemisphere continues to function independently, which suggests that each hemisphere produces a separate consciousness in these people. "In a particularly dramatic recorded demonstration, the famous patient
“Joe” was able to draw a cowboy hat with his left hand in response to
the word ‘Texas’ presented in his left visual half field. His commentary
(produced by the verbal left hemisphere) showed a complete absence of
insight into why his left hand had drawn this cowboy hat. Another
astonishing example involved the same patient. MacKay and MacKay (<a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7305066/#CR66" rid="CR66" class=" bibr popnode" role="button" aria-expanded="false" aria-haspopup="true">1982</a>)
flashed a digit in the left visual field and trained the patient to
play a version of ‘20 questions’ across hemispheres. The left hemisphere
guessed the answer vocally, and the right hemisphere provided responses
by pointing ‘up’ (meaning ‘guess a higher number’) or ‘down’ with the
left hand. In this way the patient managed to vocalize the right answer.
This suggests two independent conscious agents communicating with each
other (one steering the left hand, the other agent controlling vocal
expressions).”<div class=""><br class=""></div><div class=""><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7305066/" class="">https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7305066/</a><br class=""><div class=""><br class=""></div><div class=""><br class=""><div><blockquote type="cite" class=""><div class="">On Feb 24, 2023, at 6:18 AM, Jason Resch via extropy-chat <extropy-chat@lists.extropy.org> wrote:</div><br class="Apple-interchange-newline"><div class=""><div dir="auto" class=""><div class=""><br class=""><br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Feb 23, 2023, 2:31 PM William Flynn Wallace via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" class="">extropy-chat@lists.extropy.org</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr" class=""><div class="gmail_default" style="font-family: "comic sans ms", sans-serif; font-size: large;"><br class=""></div><div class="gmail_default" style="font-family: "comic sans ms", sans-serif; font-size: large;">Thanks, Ben - another question: why do we, or they, or somebody, think that an AI has to be conscious to solve the problems we have? Our unconscious mind solves most of our problems now, doesn't it? I think it does. 
bill w</div></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"></blockquote></blockquote></div></div><div dir="auto" class=""><br class=""></div><div dir="auto" class=""><br class=""></div><div dir="auto" class="">Why do we assume our "unconscious mind" is unconscious, rather than another mind whose consciousness we don't have access to?</div><div dir="auto" class=""><br class=""></div><div dir="auto" class="">Jason </div><div dir="auto" class=""><br class=""></div><div dir="auto" class=""><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br class=""></blockquote><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br class=""></blockquote><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br class=""></blockquote>.<br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Feb 23, 2023 at 12:24 PM Ben Zaiboc via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer" class="">extropy-chat@lists.extropy.org</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">bill w asked:<br class="">
<br class="">
>Three silly questions: how would you know if you had created a <br class="">
conscious mind? Why do you want to do that? What makes that necessary? <br class="">
bill w<br class="">
<br class="">
<br class="">
I like silly questions! (some of them anyway)<br class="">
<br class="">
1) How would you know?<br class="">
Probably you would never know for sure, just as you don't know for sure <br class="">
that I'm a conscious mind. But I'd say we'd use the same criteria as we <br class="">
do with each other, or for the existence/non-existence of gods, so while <br class="">
we never absolutely know for sure, we can make a damned good guess, <br class="">
based on the evidence at our disposal.<br class="">
<br class="">
2) Why do it?<br class="">
Because we're transhumanists, and want the sum total of self-awareness <br class="">
and intelligence in the universe to increase. Because we recognise the <br class="">
severe limitations of biological life, and if we can create artificial <br class="">
minds, we can overcome these limitations. Because we know that humans <br class="">
have a limited lifespan, both as individuals and as a species, and this <br class="">
is a way of going way beyond that.<br class="">
<br class="">
3) What makes it necessary?<br class="">
Well, that depends on your priorities. People who think that humanity is <br class="">
a stain on the world and things would be better without it, probably <br class="">
think it's not only not necessary, but undesirable. I think it's <br class="">
necessary because we are tragically weak, fragile and confused, and <br class="">
anything we can do to correct or side-step that is a good thing. <br class="">
Artificial minds are our chance to pass down our most significant <br class="">
quality to the future, in a form that has a chance of surviving and <br class="">
thriving in the long-term (very long-term, as in billions of years and <br class="">
more).<br class="">
<br class="">
Oh, and it may be the only realistic way to achieve mind uploading. We <br class="">
probably aren't capable of figuring it out, or at least of actually <br class="">
doing it, by ourselves.<br class="">
<br class="">
And it may be the only way we're going to get out of the many pickles <br class="">
we're getting ourselves into, too. Maybe we need a non-human perspective <br class="">
to solve the various seemingly unsolvable problems we've got. I don't <br class="">
need to make a list, I'm sure you can think of plenty.<br class="">
<br class="">
Ben<br class="">
_______________________________________________<br class="">
extropy-chat mailing list<br class="">
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer" class="">extropy-chat@lists.extropy.org</a><br class="">
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer noreferrer" target="_blank" class="">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br class="">
</blockquote></div>
</blockquote></div></div></div>
</div></blockquote></div><br class=""></div></div></body></html>