[ExI] AI is racist, bigoted and misogynistic
sparge at gmail.com
Thu Sep 23 18:03:08 UTC 2021
On Wed, Sep 22, 2021 at 7:38 PM William Flynn Wallace via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> This is where I begin to have problems. Emotions start in the limbic
> system where hormones start to flow into the bloodstream, going to various
> organs, such as the heart and lungs. An emotion implies motion: the heart
> goes faster for example. We literally feel emotions: there are no
> strictly cognitive emotions. So are AIs going to be equipped with various
> digital organs which express the emotion?
Emotions don't require hardware organs or actual hormone molecules: those
things can be simulated in software.
> What would count for dopamine in an AI?
Code that implements the behavior of dopamine.
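As a toy illustration of what "code that implements the behavior of dopamine" might mean: in neuroscience, dopamine activity is commonly modeled as a reward-prediction-error signal. The sketch below is a minimal, hypothetical simulation under that assumption, not a claim about how any real AI is built; the class name and parameters are invented for illustration.

```python
# Toy sketch, assuming the reward-prediction-error model of dopamine:
# the "dopamine" signal is the difference between the reward received
# and the reward that was predicted, and the prediction adapts over time.

class DopamineModel:
    """Simulated 'dopamine' as a reward-prediction-error signal."""

    def __init__(self, learning_rate=0.1):
        self.expected_reward = 0.0       # current reward prediction
        self.learning_rate = learning_rate

    def step(self, actual_reward):
        # "Dopamine burst" = actual reward minus expected reward.
        surprise = actual_reward - self.expected_reward
        # Move the prediction toward what was actually observed.
        self.expected_reward += self.learning_rate * surprise
        return surprise

dm = DopamineModel()
signals = [dm.step(1.0) for _ in range(5)]
# With the same reward repeated, the "dopamine" signal shrinks as the
# reward becomes expected -- mirroring habituation.
print(signals)
```

The point is only that the functional role of the molecule, not the molecule itself, is what gets reproduced in software.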
> What would be the purpose of programming emotions into AIs anyway?
If you want human-like behavior, you'll need emotions. In general, though,
you wouldn't want AIs to have emotions: you'd rather not worry about
offending them, dealing with their moods, and so on. And you certainly
wouldn't want them to be ambitious and independent, lest we be considered
impediments.