[ExI] cool article by shostak

William Flynn Wallace foozler83 at gmail.com
Thu Nov 17 23:09:32 UTC 2016


If the behaviour associated with emotions can be replicated by a computer,
> the emotions should follow. Otherwise, we would be able to make a partial
> philosophical zombie, which is logically problematic:
>

I think you have to separate internal feelings from overt behavior.
Running, for instance, could signal fear or joy.  Inferring an internal
state from an external one is highly problematic. (If I understand what you
are saying.)
 bill w


>
>
>> It's a very fine line.  I like this analogy:  suppose a frog whose
>> ability to jump is limited to 5 inches, vertically.  Then that frog is put
>> at the bottom of a staircase in which each step is 6 inches.  Another frog
>> who can jump 6 inches can go all the way to the top.
>>
>> So what looks like a huge qualitative difference between these two frogs
>> is really a very small quantitative difference.
>>
>> Very fine line between us and apes.
>>
>> Is intelligence really just a quantitative thing, or are dozens of
>> qualitative processes there too?  Emotions can vary quantitatively but the
>> biggest feature of them is qualitative - anger is different from anxiety,
>> for example.
>>
>> I wish I knew enough about AI to understand how they are going to program
>> qualitative states into a computer.
>>
>> I wish someone knew enough about animal emotions for us to compare us to
>> them.
>>
>> It would seem that emotions are a much more fuzzy topic than
>> intelligence, but perhaps our definitions of intelligence just are too
>> limited to appreciate the nonquantitative aspects of it.
>>
>> I am not trying to define what a human is, or just how we differ from
>> lower animals.  I don't think we know enough for that yet.
>>
>> bill w
>>
>> On Wed, Nov 16, 2016 at 7:30 PM, John Clark <johnkclark at gmail.com> wrote:
>>
>>> On Wed, Nov 16, 2016 William Flynn Wallace <foozler83 at gmail.com> wrote:
>>>
>>>
>>>> I agree with the logic of this article, but there's something missing.
>>>> Yeah - it's the rest of what it means to be human: emotions and
>>>> feelings and smells and tastes
>>>
>>>
>>> That's not what makes us human, other creatures on this planet have
>>> been able to feel and smell and taste for at least 500 million years,
>>> and they've behaved as if they had emotions too. It's intelligence that
>>> distinguishes us from the other animals and makes us human.
>>>
>>>
>>>>
>>>> Would I give up those things for a higher IQ?  What do you think?
>>>>
>>>
>>> I see no reason you couldn't have both.
>>>
>>>
>>>
>>>>
>>>> If you would, you are as cold as the machines referred to in the
>>>> article.
>>>
>>>
>>> I think it would be easier, far easier, for us to make an emotional
>>> machine than an intelligent one; certainly Evolution found that to be the case.
>>>
>>>
>>> John K Clark
>>>
>>>
>>>
>>>
>>>
>>> _______________________________________________
>>> extropy-chat mailing list
>>> extropy-chat at lists.extropy.org
>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>
>>>
>>
>>
>
>
> --
> Stathis Papaioannou
>
> http://consc.net/papers/qualia.html
>
>
>
>

