<div dir="ltr"><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000"><br style="color:rgb(80,0,80);font-family:Arial,Helvetica,sans-serif;font-size:small"><span style="color:rgb(80,0,80);font-family:Arial,Helvetica,sans-serif;font-size:small">So we are back to the old problem -</span><br style="color:rgb(80,0,80);font-family:Arial,Helvetica,sans-serif;font-size:small"><span style="color:rgb(80,0,80);font-family:Arial,Helvetica,sans-serif;font-size:small">Is the bot really 'human' or just pretending to be 'human'? :) </span><span style="color:rgb(80,0,80);font-family:Arial,Helvetica,sans-serif;font-size:small">BillK</span><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000"><span style="color:rgb(80,0,80);font-family:Arial,Helvetica,sans-serif;font-size:small"><br></span></div><div class="gmail_default" style="font-size:large;color:rgb(0,0,0)"><span style="color:rgb(80,0,80);font-size:small"><font face="comic sans ms, sans-serif">There are people who think of themselves as phonies: they believe that if others knew the 'real' them, their opinion of them would drop considerably. But if you act intelligent, say intelligent things, and do intelligent things, aren't you intelligent? So what's the difference between a chatbot that acts human and one that is? How would you tell? 
bill w</font></span></div></div><div dir="ltr"><div dir="ltr"><div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)"><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sat, May 13, 2023 at 1:46 PM Jason Resch via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="auto"><div><br><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sat, May 13, 2023, 10:12 AM BillK via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Sat, 13 May 2023 at 13:44, efc--- via extropy-chat<br>
<<a href="mailto:extropy-chat@lists.extropy.org" rel="noreferrer" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br>
><br>
> Hello Bill,<br>
><br>
> That would be the surface interpretation. My thoughts are more along the<br>
> lines of what this means for these types of AIs from a broader perspective.<br>
><br>
> Do the companies fear the consequences, do they fear political<br>
> legislation, or do they fear the public's reaction if a future ChatGPT<br>
> were to successfully generate empathy?<br>
><br>
> Could we, in the long run, see a repetition of history, where our AIs<br>
> are tools today, slaves tomorrow, and fully embraced citizens with rights<br>
> the day after tomorrow?<br>
><br>
> Best regards, Daniel<br>
>_______________________________________________<br>
<br>
<br>
<br>
Well, chatbots already demonstrate empathy with humans.<br>
See:<br>
<<a href="https://en.wikipedia.org/wiki/Kuki_AI" rel="noreferrer noreferrer" target="_blank">https://en.wikipedia.org/wiki/Kuki_AI</a>><br>
<<a href="https://en.wikipedia.org/wiki/Replika" rel="noreferrer noreferrer" target="_blank">https://en.wikipedia.org/wiki/Replika</a>><br>
<<a href="https://woebothealth.com/" rel="noreferrer noreferrer" target="_blank">https://woebothealth.com/</a>><br>
<<a href="https://appadvice.com/app/mila-ai-assistant-chatbot/1663672156" rel="noreferrer noreferrer" target="_blank">https://appadvice.com/app/mila-ai-assistant-chatbot/1663672156</a>><br>
<<a href="https://www.x2ai.com/individuals" rel="noreferrer noreferrer" target="_blank">https://www.x2ai.com/individuals</a>><br>
and more...<br>
<br>
These chatbots talk to humans about their feelings and problems, and<br>
sympathise with them.<br>
The Replika reviews have people falling in love with their chatbot.<br>
Obviously, the bots don't *feel* empathy,</blockquote></div></div><div dir="auto"><br></div><div dir="auto"><br></div><div dir="auto">When is it ever obvious what another might be feeling or not feeling, and how do we tell?</div><div dir="auto"><br></div><div dir="auto">Jason </div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"> but their words express<br>
empathy and greatly assist humans with emotional issues.<br>
<br>
So we are back to the old problem -<br>
Is the bot really 'human' or just pretending to be 'human'? :)<br>
<br>
<br>
BillK<br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" rel="noreferrer" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div></div></div>
</blockquote></div></div>