[ExI] The digital nature of brains (was: digital simulations)

Stathis Papaioannou stathisp at gmail.com
Mon Jan 25 15:19:59 UTC 2010


On 26 January 2010 00:11, Gordon Swobe <gts_2000 at yahoo.com> wrote:

> It certainly is a problem once you understand the syntax-semantics problem. You just don't take it seriously or don't understand it.

You are saying that in addition to the symbol grounding problem there
is the problem of attaching "meaning" to the symbols. You can't explain
what this meaning is but you feel that humans have it and computers
don't. No empirical test can ever convince you that computers have it,
because by definition there is no empirical test for it. Apparently no
analytic argument can convince you either.

> Do you believe your desktop or laptop computer has conscious understanding of the words you type?

No. I don't believe animals understand everything humans do either,
even though mammalian brains are structurally all very similar. But if
the computer were able to have a convincing conversation with me (not a
trick like ELIZA), then I would have to consider that it may
understand the words. Furthermore, if the computer were based on
reverse-engineering a human brain, then I would say it has to have the
same consciousness as a human. As I have explained several times, I am
led to the latter conclusion from the absurdity that results from
assuming it false, rather in the way you can prove that sqrt(2) is
irrational by assuming that it is rational and showing that the
assumption leads to a contradiction. It's frustrating that you
probably understand this but choose simply to dismiss it.
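
To make the analogy concrete, here is a brief sketch of that standard
reductio: assume sqrt(2) = p/q with p and q sharing no common factor.
Then p^2 = 2q^2, so p is even, say p = 2k; substituting gives q^2 =
2k^2, so q is even as well, contradicting the assumption that p and q
share no common factor. The argument about the reverse-engineered
brain has the same shape: assume it lacks consciousness and derive an
absurdity.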


-- 
Stathis Papaioannou


