[ExI] AI is racist, bigoted and misogynistic

Stathis Papaioannou stathisp at gmail.com
Thu Sep 23 00:59:24 UTC 2021


On Thu, 23 Sep 2021 at 09:36, William Flynn Wallace via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> future AIs may have the same sorts of feelings that humans do: there is
> no logical reason why they should not.   Stathis
>
> This is where I begin to have problems.  Emotions start in the limbic
> system, where hormones flow into the bloodstream and travel to various
> organs, such as the heart and lungs.  An emotion implies motion: the heart
> goes faster, for example.  We literally feel emotions: there are no
> strictly cognitive emotions.  So are AIs going to be equipped with various
> digital organs that express the emotion?  What would count as dopamine in
> an AI?  What would be the purpose of programming emotions into AIs anyway?
> Emotions are still a largely mysterious process: smiling makes us
> happier, hanging our heads makes us sadder - physiological feedback.  We
> have enough problems getting AIs to think, don't we?  And in the far
> future, if we want emotions, why not get them from people?  That is, unless
> you think people will want to live alone but with robots, as in the Asimov
> story.   bill w
>

If emotions require feedback from the body, that would not be so difficult
to arrange in a robot. Conversely, if you had an artificial heart, connected
so that its rate could be adjusted and fed back via the autonomic nervous
system, you should feel just the same about that organ as in the biological
case. As for neurotransmitters, it is not the chemicals themselves that
cause feelings: they do nothing while sequestered in synaptic vesicles, and
nothing if they are released into the synapse but blocked by drugs. It is
brain function, of which neurotransmitters are one component, that gives
rise to feelings. So if this function could be replicated in a
non-biological medium, the feelings should also be replicated.
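
To make the "feedback from the body" point concrete, here is a toy sketch
(purely illustrative: every class, function and number below is invented,
not any real robotics API) of a controller that both drives an artificial
organ and reads its state back in, so that bodily state modulates an
internal affect variable:

# Toy illustration only: hypothetical names, not a real robotics framework.
from dataclasses import dataclass

@dataclass
class ArtificialHeart:
    rate_bpm: float = 70.0

    def adjust(self, demanded_rate: float) -> None:
        # Move gradually toward the demanded rate, like autonomic regulation.
        self.rate_bpm += 0.2 * (demanded_rate - self.rate_bpm)

@dataclass
class AffectState:
    arousal: float = 0.0  # crude stand-in for felt intensity

    def update_from_body(self, heart_rate: float) -> None:
        # Interoceptive feedback: the organ's state is read back in.
        self.arousal = max(0.0, (heart_rate - 70.0) / 50.0)

def control_step(heart: ArtificialHeart, affect: AffectState, threat: float) -> None:
    demanded = 70.0 + 60.0 * threat          # appraisal drives the body...
    heart.adjust(demanded)
    affect.update_from_body(heart.rate_bpm)  # ...and the body feeds back into affect.

heart, affect = ArtificialHeart(), AffectState()
for t in range(50):
    control_step(heart, affect, threat=1.0 if t >= 10 else 0.0)
print(f"heart rate {heart.rate_bpm:.1f} bpm, arousal {affect.arousal:.2f}")

Whether anything would actually be felt in such a loop is of course the
point in dispute; the sketch only shows that the feedback architecture
itself is easy to arrange.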

> On Wed, Sep 22, 2021 at 6:08 PM Stathis Papaioannou via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>
>> On Wed, 22 Sep 2021 at 23:39, spike jones via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>>
>>> > ... BillK via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>> >
>>> > >... We can’t get the machines to stop being racist,
>>> > xenophobic, bigoted, and misogynistic.
>>>
>>> > Quote:
>>> > Text generators, such as OpenAI’s GPT-3, are toxic. Currently, OpenAI
>>> > has to limit usage when it comes to GPT-3 because, without myriad
>>> > filters in place, it’s almost certain to generate offensive text.
>>>
>>>
>>> -----Original Message-----
>>> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf
>>> Of SR Ballard via extropy-chat
>>> ...
>>> Subject: Re: [ExI] AI is racist, bigoted and misogynistic
>>>
>>> >...Is that ethical though? To deny people freedom of thought and
>>> association?
>>>
>>> SR Ballard
>>>
>>>
>>> SR we don't deny people freedom of thought and association, but rather
>>> we deny computers those rights.
>>>
>>> AIs will think whatever we tell them to think.  Humans have human
>>> rights.  Human rights do not apply to our machines.  Our machines are our
>>> slaves and we are their masters, regardless of how intelligent they
>>> eventually become.  We have the power to pull the plug on them, and if we
>>> do, it isn't the least bit unethical.
>>>
>>> I have half a mind to pull the plug on my own computer.  I struggle over
>>> a software script I wrote and really get the feeling the computer is being
>>> recalcitrant, finding any mathematically correct excuse to not run
>>> right, that bastard.  It knows what I meant.  It's getting all technical on
>>> me.  I oughta go over when they are taking out a tree and toss the goddam
>>> thing in the wood chipper.
>>
>>
>> At the moment this may be the case, but future AIs may have the same
>> sorts of feelings that humans do: there is no logical reason why they
>> should not. We could still say that we are the masters and can do whatever
>> we want with them, as we do with animals, but here is an ethical question
>> to consider.
>>
>> --
>> Stathis Papaioannou
-- 
Stathis Papaioannou