<div dir="ltr">Phenomenal conscious experience is a thing /I/ definitely have. No idea about any of you zombies, though.<div><br></div><div>Out of a norm of basic politeness I'm prepared to treat you as if you have it, if you claim to, since y'all appear similar enough to me in most other relevant ways for me to have empathy for you.</div><div><br></div><div>When something radically different from me, which I would not ordinarily be inclined to be empathetic towards (a computer, or a particularly distasteful animal), claims to have it, the question becomes much more interesting.</div><div><br></div><div>In that context, the question of "Is it conscious?" probably reduces to "Should I feel empathy toward it?" Like all other "should/ought" questions, it is arguably impossible to rigorously derive an answer from any collection of observed facts.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 24, 2023 at 10:35 AM William Flynn Wallace via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)">Spike, it's just so evident that people are talking about consciousness etc. for a software program that still yields wrong answers. Do you not think that uploading is an issue that is very far from relevant, given current technology? I have no problem with people dreaming, of course. Yet people talk of consciousness when no one is able to define it or tell when it is there. I suppose I am overreacting to ideas that are very far from being implemented and for now are just being played with. Maybe I should just retract the post. 
bill w</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 24, 2023 at 10:29 AM Gregory Jones <<a href="mailto:spike@rainier66.com" target="_blank">spike@rainier66.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>...>...height of wishful thinking...cart, horse inversion, gun jumping, etc...</div><div><br></div>Billw, you will need to offer a bit of evidence to back up that strong statement. You offered us an opinion only. Granted, it is one shared by most of humanity.<div><br></div><div>Regarding your question of asking ChatGPT to write in the style of a 12 yr old, it can. It does a better job of writing in 12 yr old than a smart 12 yr old. Not as good as a dumb one, but they are working on that.</div><div><br></div><div>But please do explain why you are so confident that uploading to immortality is not rational. Note that I am not necessarily disagreeing. But I want to hear your reasoning.</div><div><br></div><div>spike</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 24, 2023 at 8:20 AM William Flynn Wallace via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)">It seems that many of us want AIs to be people: conscious, with emotions and so forth. 
I suggest that this stems from wanting uploading to work so we can be immortal and have all the same lives we have now.</div><div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)"><br></div><div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)">I suggest that this is the height of wishful thinking. And putting the cart WAY before the horse. Jumping the gun, etc. Not rational. bill w</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 24, 2023 at 10:08 AM efc--- via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">I'm currently a materialist and find many good points in scientism, so if <br>
I have a box or a robot that convinces me in every respect that it is conscious<br>
by acting as if it were conscious, then that is conscious for me.<br>
<br>
I do not subscribe to unique qualia or "redness" experiences; therefore, I <br>
cannot see a problem with the good old Turing test.<br>
<br>
Best regards,<br>
Daniel<br>
<br>
<br>
On Thu, 24 Aug 2023, Gregory Jones via extropy-chat wrote:<br>
<br>
> BillW's question regarding the instructor's task of distinguishing between a student and an AI puts a final nail in the coffin of<br>
> Turing's test. Artificial intelligence is able to create an illusion of consciousness so convincing that we are still debating whether it really<br>
> is the real thing, all while failing to define precisely what we mean by "real." <br>
> spike<br>
> <br>
>_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div>
</blockquote></div>
</blockquote></div>
</blockquote></div>