[ExI] The digital nature of brains (was: digital simulations)

Eric Messick eric at m056832107.syzygy.com
Sun Jan 31 18:29:26 UTC 2010


Gordon:
>Eric:
>> So, can you tell the difference?
>
>I can't know the difference from their external behavior, but I can
> know it from a bit of surgery + some philosophical arguments.

In other words, if you directly experienced something that contradicted
your philosophical arguments, you would believe the philosophy over
the reality.

I guess heavy objects must fall faster than light ones.  Let's not
bother to do the experiment.

>> Or do you claim that it will always be impossible to create
>> such a simulation in the first place?  No, wait, you've
>> already said that systems that pass the Turing Test will be possible,
>> so you're no longer claiming that it is impossible.  Do you want to
>> change your mind on that again?
>
>Excuse me? I never argued for the impossibility of such systems and I
> have not "changed my mind" about this. I wonder now if I can count on
> you for an honest discussion.

Going through old messages, the first one I found that fit my memory
of this exchange was:

>Message-ID: <845939.46868.qm at web36506.mail.mud.yahoo.com>
>Date: Mon, 28 Dec 2009 04:47:32 -0800 (PST)
>From: Gordon Swobe <gts_2000 at yahoo.com>
>To: ExI chat list <extropy-chat at lists.extropy.org>
>Subject: Re: [ExI] The symbol grounding problem in strong AI
>
>--- On Sun, 12/27/09, Stathis Papaioannou <stathisp at gmail.com> wrote:
>[...]
>> If the replacement neurons behave normally in their
>> interactions with the remaining brain, then the subject *must* 
>> behave normally. 
>
>But your replacement neurons *won't* behave normally, and so your
>possible conclusions don't follow.
>[...]

This was the start of a series of posts in which you said that someone
whose brain had been partially replaced with programmatic neurons
would behave as though he were at least partially unconscious.  You
claimed that the surgeon would have to replace more and more of the
brain until the patient once again behaved as though he were
conscious, even though the extensive replacement had zombified him.

You were, in essence, claiming that it is impossible to create
programmatic neurons that behave the same way as biological neurons.

I wonder now if I can count on you for an honest discussion.

-eric
