[ExI] ai emotions

Mike Dougherty msd001 at gmail.com
Tue Jun 25 10:56:58 UTC 2019


On Tue, Jun 25, 2019, 6:11 AM Rafal Smigrodzki <rafal.smigrodzki at gmail.com>
wrote:

>
>
>> ### The latter is closer to being true, of course, - if you don't have a
> lot of money or other useful resources, any goody-goody feelings you may
> have don't matter. A bit like "When a tree falls in the forest and nobody
> hears it, it doesn't matter if it makes a sound".
>
>
>
What this thread has proven to me is that some of the smartest people I
know do not understand compassion, so the likelihood that "ai emotions"
will be any better is small (assuming AI will be "learning" such things
from other "smart people").

The word is rooted in "co-suffering" and is about experiencing someone
else's situation as one's own. Simply saying "I understand" could be an
act of compassion. Our world is so lacking in this feature that even the
commonly understood meaning of the word has been lost. That seems very
Newspeak to me.  (smh)
