[ExI] Hidden rules of Microsoft copilot.
William Flynn Wallace
foozler83 at gmail.com
Sat Jun 3 17:13:38 UTC 2023
So we are back to the old problem -
Is the bot really 'human' or just pretending to be 'human'? :) BillK
There are people who think of themselves as phonies: that if others knew the
'real' them, they would reconsider their opinion and lower it quite a
bit. But if you act intelligent, say intelligent things, and do intelligent
things, aren't you intelligent? So what's the difference between a
chatbot that acts human and one that is? How would you tell?  bill w
On Sat, May 13, 2023 at 1:46 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
>
>
> On Sat, May 13, 2023, 10:12 AM BillK via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Sat, 13 May 2023 at 13:44, efc--- via extropy-chat
>> <extropy-chat at lists.extropy.org> wrote:
>> >
>> > Hello Bill,
>> >
>> > That would be the surface interpretation. My thoughts are more along the
>> > lines of what this means for these types of AIs in a broader
>> > perspective.
>> >
>> > Do the companies fear the consequences, do they fear political
>> > legislation, or do they fear the public's reaction if a future ChatGPT
>> > were to successfully generate empathy?
>> >
>> > Could we, in the long run, see a repetition of history where our AIs
>> > are tools today, slaves tomorrow, and fully embraced citizens with rights
>> > the day after tomorrow?
>> >
>> > Best regards, Daniel
>>
>> Well, chatbots already demonstrate empathy with humans.
>> See:
>> <https://en.wikipedia.org/wiki/Kuki_AI>
>> <https://en.wikipedia.org/wiki/Replika>
>> <https://woebothealth.com/>
>> <https://appadvice.com/app/mila-ai-assistant-chatbot/1663672156>
>> <https://www.x2ai.com/individuals>
>> and more........
>>
>> These chatbots talk to humans about their feelings and problems, and
>> sympathise with them.
>> The Replika reviews have people falling in love with their chatbot.
>> Obviously, the bots don't *feel* empathy,
>
>
>
> When is it ever obvious what another might be feeling or not feeling, and
> how do we tell?
>
> Jason
>
> but their words express
>> empathy and greatly assist humans with emotional issues.
>>
>> So we are back to the old problem -
>> Is the bot really 'human' or just pretending to be 'human'? :)
>>
>>
>> BillK