[ExI] Meaningless Symbols.

Stathis Papaioannou stathisp at gmail.com
Sun Jan 17 09:19:13 UTC 2010


2010/1/17 Eric Messick <eric at m056832107.syzygy.com>:
> Gordon writes:
>>--- On Sat, 1/16/10, Eric Messick <eric at m056832107.syzygy.com> wrote:
>>> Would you say that "description != thing" is the reason
>>> computer systems cannot replicate the capability of brains to
>>> understand?
>>
>>In a general sense, yes. I think S/H systems can simulate
>> understanding as in weak AI but that they cannot have conscious
>> understanding as in strong AI.
>
> Putting these together, what we're discussing here is the truth value
> of the statement: "simulated understanding != real understanding", and
> what you might consider a corollary: "simulated consciousness != real
> consciousness".
>
> You seem to take these statements as axioms.
>
> I think that both consciousness and understanding are computational
> processes.  Turing showed that beyond a very simple set of
> capabilities, all computational processes can be considered equivalent
> (modulo obvious performance differences).  If both these things are
> true, then simulated consciousness must equal real consciousness, in
> violation of your axiom.
>
> Unless you want to argue with Turing, we can construct the following
> statement:
>
>  If simulated consciousness is not equivalent to real consciousness,
>  then consciousness is not a computational process.
>
> Do you agree that this is a true statement?

I am guessing that Gordon's answer will be an emphatic "yes".

> Perhaps instead, you've taken the consequent part as your axiom, and
> derived your thoughts about simulated equivalence.
>
> In any case, it appears we have someone taking A as an axiom talking
> with someone taking ~A as an axiom.  Not easy to resolve.

I've tried to show with the brain replacement thought experiment what
simulated understanding done properly would be like. It would mean
that if your understanding of language, or your perceptions, or
emotions, or any other important aspect of consciousness were suddenly
zombified, your behaviour would remain unchanged and you would not
notice that anything odd had happened. If there is no subjective or
objective difference between simulated (zombie) understanding and real
understanding, then it seems absurd to insist that they are different
things. I believe that Gordon can see this too, which is why he
initially said that the experiment was too ridiculous to contemplate
and now says that the zombie implant won't really be a zombie implant.


-- 
Stathis Papaioannou
