[ExI] The digital nature of brains (was: digital simulations)

Gordon Swobe gts_2000 at yahoo.com
Sun Jan 31 19:22:33 UTC 2010


--- On Sun, 1/31/10, Eric Messick <eric at m056832107.syzygy.com> wrote:

>> I can't know the difference from their external behavior, but I can 
>> know it from a bit of surgery + some philosophical
>> arguments.
> 
> In other words, if you were to directly experience
> something which contradicts your philosophical arguments, you would 
> believe the philosophy over the reality.

Looks to me like you want to put words in my mouth, and that you don't want, or perhaps don't know how, to have a fair and honest discussion. You're losing credibility with me fast.

>> Excuse me? I never argued for the impossibility of such
>> systems and I have not "changed my mind" about this. I wonder now if
>> I can count on you for an honest discussion.
> 
> Going through old messages, the first I found that fit my
> memory of this was:
> 
> >Message-ID: <845939.46868.qm at web36506.mail.mud.yahoo.com>
> >Date: Mon, 28 Dec 2009 04:47:32 -0800 (PST)
> >From: Gordon Swobe <gts_2000 at yahoo.com>
> >To: ExI chat list <extropy-chat at lists.extropy.org>
> >Subject: Re: [ExI] The symbol grounding problem in strong AI
> >
> >--- On Sun, 12/27/09, Stathis Papaioannou <stathisp at gmail.com> wrote:
> >[...]
> >> If the replacement neurons behave normally in their
> >> interactions with the remaining brain, then the subject *must*
> >> behave normally.
> >
> >But your replacement neurons *won't* behave normally, and so your
> >possible conclusions don't follow.
> >[...]
> 
> This was the start of a series of posts where you said that
> someone with a brain that had been partially replaced with
> programmatic neurons would behave as though he was at least partially
> not conscious.  You claimed that the surgeon would have to
> replace more and more of the brain until he behaved as though he was
> conscious, but had been zombified by extensive replacement.

Right, and Stathis' subject will eventually pass the TT, just as your subject will in your thought experiment. But in both cases the TT will give a false positive: the subjects will have no real first-person conscious intentional states.

-gts
