[ExI] ai emotions

William Flynn Wallace foozler83 at gmail.com
Tue Jun 25 13:19:48 UTC 2019


These two words are used in ambiguous ways: sympathy and empathy.

Sympathy - have you ever seen one of the old (1700s) cellos with extra
strings that are not bowed or plucked?  What they do is vibrate when the
other strings make certain sounds - in sympathy.  So this word, to me, says
that if a person feels sympathy, he feels the same as the person he is
observing.  On my grammar school yard a boy tossed his cookies, and shortly
thereafter another boy did, then another.  Watching it made them feel
exactly the same - nauseated.  Watching a person cry and sob can make you
very sad too.

Empathy is when you care about another person's feelings and may have had
them yourself before, but are not literally feeling what they feel.

I think that one or the other of these has to be present, or we could not
enjoy sports, novels, or anything else with characters or teams that we
identify with.  On the other hand, we presumably enjoy - though that might
not be the right word - horror stories and movies, family tragedy stories,
or in fact anything that elicits emotions in people.  A gigantic online
industry is built on eliciting one emotion in particular: sexual arousal.

I don't know how you define these words, or the word compassion, but I
think compassion might be either sympathy or empathy.  bill w

On Tue, Jun 25, 2019 at 7:48 AM BillK <pharos at gmail.com> wrote:

> On Tue, 25 Jun 2019 at 12:02, Mike Dougherty <msd001 at gmail.com> wrote:
> >
> > On Tue, Jun 25, 2019, 6:11 AM Rafal Smigrodzki <rafal.smigrodzki at gmail.com> wrote:
> >> ### The latter is closer to being true, of course - if you don't have
> >> a lot of money or other useful resources, any goody-goody feelings you
> >> may have don't matter. A bit like "When a tree falls in the forest and
> >> nobody hears it, it doesn't matter if it makes a sound".
> >>
> >>
> >
> > What this thread has proven to me is that some of the smartest people I
> > know do not understand compassion, so the likelihood that "ai emotions"
> > will be any better is small (assuming AI will be "learning" such things
> > from other "smart people").
> >
> > The word is rooted in "co-suffering" and is about experiencing someone's
> > situation as one's own. Simply sharing "I understand" could be an act of
> > compassion. Our world is so lacking in this feature that even the
> > commonly understood meaning of the word has been lost. That seems very
> > Newspeak to me. (smh)
>
>
> That is a well-known problem for AI. If humans are biased, then what
> they build will have the same biases built in. They may try to avoid
> some of their known biases, such as racial prejudice, but all their
> unconscious biases will still be there. Even the training data fed to
> AIs will be biased.
> On the other hand, humans don't want cold, merciless AI intelligences.
> They want AIs to be biased to look after humans.
> But not *all* humans, of course.
> Humans can feel compassion for the people they are bombing to
> destruction, yet still feel duty-bound to continue for many reasons.
> Human society is a confused mess, and it is likely that if AIs become
> autonomous they will follow the same path.
>
>
> BillK
>

