[ExI] ai emotions

Rafal Smigrodzki rafal.smigrodzki at gmail.com
Tue Jun 25 16:19:55 UTC 2019

On Tue, Jun 25, 2019 at 6:57 AM Mike Dougherty <msd001 at gmail.com> wrote:

> On Tue, Jun 25, 2019, 6:11 AM Rafal Smigrodzki <rafal.smigrodzki at gmail.com>
> wrote:
>> ### The latter is closer to being true, of course: if you don't have a
>> lot of money or other useful resources, any goody-goody feelings you may
>> have don't matter. A bit like "When a tree falls in the forest and nobody
>> hears it, it doesn't matter if it makes a sound".
> What this thread has proven to me is that some of the smartest people I
> know do not understand compassion, so the likelihood that "ai emotions"
> will be any better is small (assuming ai will be "learning" such things
> from other "smart people").
> The word is rooted in "co-suffering" and is about experiencing someone's
> situation as one's own. Simply sharing "I understand" could be an act of
> compassion. Our world is so lacking in this feature that even the
> commonly understood meaning of the word has been lost. That seems very
> Newspeak to me. (smh)

### Compassion not supported by cold hard cash is a cheap thing to offer.
Maybe that's why there is so much compassion making the rounds, without a
corresponding amount of problem-solving.

