<div dir="ltr"><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000">These two words are used in ambiguous ways: sympathy and empathy.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000">Sympathy - have you ever seen one of the old (1700s) cellos that have extra strings that are never bowed or plucked? They vibrate when the other strings make certain sounds - in sympathy. So this word, to me, says that a person who feels sympathy feels the same as the person he is observing. On my grammar school yard a boy tossed his cookies, and shortly thereafter another boy did, then another. Watching it made them feel exactly the same - nauseated. Watching a person cry and sob can make you very sad too.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000">Empathy is when you care about another person's feelings, and may have had them yourself before, but are not literally feeling what they feel.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000">I think that one or the other of these has to be present, or we could not enjoy sports, novels, or anything with characters or teams that we identify with. On the other hand, presumably we enjoy - though that might not be the right word - horror stories and movies, family tragedy stories, or in fact anything that elicits emotions in people. 
A gigantic online industry is built on eliciting emotions: sexual arousal.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:small;color:#000000">I don't know how you define these words, or the word compassion, but I think compassion might be either sympathy or empathy. bill w</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Jun 25, 2019 at 7:48 AM BillK <<a href="mailto:pharos@gmail.com">pharos@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Tue, 25 Jun 2019 at 12:02, Mike Dougherty <<a href="mailto:msd001@gmail.com" target="_blank">msd001@gmail.com</a>> wrote:<br>
><br>
> On Tue, Jun 25, 2019, 6:11 AM Rafal Smigrodzki <<a href="mailto:rafal.smigrodzki@gmail.com" target="_blank">rafal.smigrodzki@gmail.com</a>> wrote:<br>
>> ### The latter is closer to being true, of course - if you don't have a lot of money or other useful resources, any goody-goody feelings you may have don't matter. A bit like "When a tree falls in the forest and nobody hears it, it doesn't matter if it makes a sound".<br>
>><br>
>><br>
><br>
> What this thread has proven to me is that some of the smartest people I know do not understand compassion, so the likelihood that "AI emotions" will be any better is small (assuming AI will be "learning" such things from other "smart people").<br>
><br>
> The word is rooted in "co-suffering" and is about experiencing someone's situation as one's own. Simply sharing "I understand" could be an act of compassion. Our world is so lacking in this feature that even the commonly understood meaning of the word has been lost. That seems very Newspeak to me. (smh)<br>
> _______________________________________________<br>
<br>
<br>
That is a well-known problem for AI. If humans are biased, then what<br>
they build will have the same biases built in. They may try to avoid<br>
some of their known biases - racial prejudice, for example - but all<br>
their unconscious biases will still be there. Even the training data<br>
fed to AIs will be biased.<br>
On the other hand, humans don't want cold, merciless AI intelligences.<br>
They want AIs to be biased to look after humans.<br>
But not *all* humans, of course.<br>
Humans can feel compassion for the people they are bombing to<br>
destruction, yet still feel duty-bound to continue for many reasons.<br>
Human society is a confused mess, and if AIs become autonomous, they<br>
will likely follow the same path.<br>
<br>
<br>
BillK<br>
<br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div>