[ExI] Do digital computers feel?

John Clark johnkclark at gmail.com
Thu Dec 29 21:01:18 UTC 2016


On Thu, Dec 29, 2016 at 9:41 AM, Dave Sill <sparge at gmail.com> wrote:


> the program doesn't "understand" that:

Forget the program: if it's not behavior, how do you know what your fellow
human beings do or do not understand?



> How the program deals with unexpected conditions like simultaneous red and
> green lights depends, again, on what the programmer implemented.

But even the programmer doesn't know what the programmer implemented. The
programmer took 5 minutes to write a program to find the first even number
greater than 2 that is not the sum of two primes and then stop, but the
programmer has no idea what the computer will do when it runs that program;
even worse, the programmer doesn't even know if he will ever know. The
computer will decide for itself when, or even if, it will stop.
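The five-minute program described above is easy to write down. Here is a minimal Python sketch (the function names are mine, chosen for illustration): it walks the even numbers looking for a Goldbach counterexample, and nobody knows whether the loop ever exits.

```python
def is_prime(n):
    """Trial-division primality test; fine for a sketch."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def is_sum_of_two_primes(n):
    """True if n = p + q for some primes p, q."""
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

def find_goldbach_counterexample():
    n = 4
    # Whether this loop ever terminates is exactly the Goldbach
    # conjecture; the programmer who wrote it cannot say.
    while is_sum_of_two_primes(n):
        n += 2
    return n  # reached only if the conjecture is false
```

Calling `find_goldbach_counterexample()` may run forever; the program is trivial to write, yet its author cannot predict its behavior, which is the point being made.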


> that's just silly anthropomorphism.

I am using anthropomorphism right now to conclude that you are probably
conscious. Am I being silly? What I have done is draw an analogy between the
only thing in the universe known with absolute certainty to be conscious (me)
and another thing that behaves in complex ways with certain similarities to
the way I behave (you). If something behaves rather like me, I conclude it is
probably conscious, rather like me. I could be wrong, but it's the best I can
do.


> The program doesn't "feel" or "want" anything

How do you know this? How do you know your fellow human beings feel or want
anything?


> brains are extraordinarily complex and not well understood.

But you think you understand brains well enough to know they are not complex
enough to make an AI.

John K Clark

