[ExI] Do digital computers feel?

Jason Resch jasonresch at gmail.com
Fri Dec 30 19:09:11 UTC 2016


I came across this video today, and thought it relevant to this
conversation:

https://www.youtube.com/watch?v=nQHBAdShgYI

Jason

On Fri, Dec 30, 2016 at 9:01 AM, Jason Resch <jasonresch at gmail.com> wrote:

>
>
> On Fri, Dec 30, 2016 at 1:14 AM, Rafal Smigrodzki <
> rafal.smigrodzki at gmail.com> wrote:
>
>>
>>
>> On Tue, Dec 27, 2016 at 12:03 PM, Jason Resch <jasonresch at gmail.com>
>> wrote:
>>
>>>
>>> If infinities are relevant to mental states, they must be irrelevant to
>>> any external behavior that can be tested in any way. This is because the
>>> holographic principle places discrete and finite bounds on the amount of
>>> information that can be stored in an area of space of finite volume. Even
>>> if there is infinite information in your head, no physical process
>>> (including you) can access it.
>>>
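The finite bound alluded to here can be made concrete with the Bekenstein bound, which limits the information content of any region of finite size and energy. A rough back-of-the-envelope sketch follows; the head radius (~0.1 m) and mass (~1.5 kg) are illustrative assumptions, not measurements:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def bekenstein_bound_bits(radius_m: float, mass_kg: float) -> float:
    """Upper bound on information (bits) storable in a sphere of the
    given radius whose total energy is mass_kg * c^2."""
    energy = mass_kg * C**2
    return 2 * math.pi * radius_m * energy / (HBAR * C * math.log(2))

# Roughly 4 x 10^42 bits for a head-sized region: astronomically
# large, but strictly finite.
print(f"{bekenstein_bound_bits(0.1, 1.5):.2e}")
```

However large that number is, it is finite, which is all the argument above requires.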
>>
>> ### Indeed, this is a valuable insight. But you could still have
>> qualitative but inaccessible (to other observers) differences between the
>> mental states realized on finite machines vs. ones implemented in
>> (putatively) infinite physics.
>>
>> ---------------------------------------------------
>>
>
> What would be accessing this information and having these perceptions
> then? It seems to me you would need some "raw perceiver" which itself is
> divorced entirely from the physical universe. Can there be perceptions that
> in principle have no effect on behavior whatsoever? Not even as detectable
> differences in neuronal behavior or the positions of particles in the brain?
>
>
>>
>>
>>>>
>>> This analogy is somewhat backwards, in my opinion.
>>>
>>> It's not that the brain works like a computer; it's that computers can
>>> perfectly mimic any finite process. They are "universal machines" in the
>>> same sense that a universal remote can stand in for any remote, or a
>>> speaker system can function as a "universal instrument".
>>>
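The "universal machine" point can be sketched in a few lines: one fixed emulator routine reproduces the behavior of any finite-state machine handed to it as data. The parity-checking machine below is a made-up example, not anything from the thread:

```python
def emulate(transitions, start, inputs):
    """Run an arbitrary finite-state machine described by a
    (state, symbol) -> state transition table."""
    state = start
    for symbol in inputs:
        state = transitions[(state, symbol)]
    return state

# Example machine: tracks whether the number of 1s seen so far is even.
parity = {
    ("even", 0): "even", ("even", 1): "odd",
    ("odd", 0): "odd",   ("odd", 1): "even",
}

print(emulate(parity, "even", [1, 1, 0, 1]))  # prints "odd"
```

The same `emulate` function runs any such machine without modification, which is the sense in which a single device can mimic every finite process.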
>>> Therefore, if the brain is a machine, and is finite, then an
>>> appropriately programmed computer can perfectly emulate any of its
>>> behaviors. Philosophers generally fall into one of three camps on the
>>> question of consciousness and the computational theory of mind:
>>>
>>> *Non-computable physics proponents - *Believe human thought involves physical
>>> processes that are non-computable, and therefore conclude that it’s
>>> impossible to replicate the behavior of a human brain using a computer.
>>>
>>> *Weak AI proponents - * Believe the behavior of the human brain can be
>>> replicated by computer, but assume such a reproduction, no matter how good,
>>> would not possess a mind or consciousness.
>>>
>>> *Computationalists - *Believe the behavior of the human brain can be
>>> replicated by a computer, and assume that when the reproduction is
>>> sufficiently faithful, it possesses a mind and is conscious.
>>>
>>>
>>> Which camp do you consider yourself in?
>>>
>>
>> ### I have always considered myself a computationalist, but recently,
>> thinking about the identity of indiscernibles as applied to finite
>> mathematical objects simulating mental processes, I became confused. I think
>> I am still a computationalist but a mildly uneasy one. At least, if
>> digitally simulated human minds are P-zombies, it won't hurt to be one, so
>> I still intend to get uploaded ASAP.
>>
>>
> Where does your unease come from? Is it the uncertainty over whether the
> brain is infinite or finite? I think even if it is finite there is reason
> to be uneasy over uploading: the question of whether the functional
> substitution captures the necessary level. The concept of a substitution
> level is defined and explored in this paper:
> http://iridia.ulb.ac.be/~marchal/publications/CiE2007/SIENA.pdf
>
> I think the importance of the substitution level is what Ned Block
> captured in his Blockhead thought experiment (
> https://en.wikipedia.org/wiki/Blockhead_(computer_system) ), where his
> brain is replaced with a giant lookup table. Such a table can replicate
> his external behaviors, but it is an entirely different function from one
> that actually implements his mind, and thus it may be a zombie or
> zombie-like.
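Block's lookup-table point can be illustrated with a toy sketch. Here `add` stands in for any finite behavior; nothing in this example comes from Block's paper, it only shows how a table can match a function's outputs while performing none of its computation:

```python
def add(a, b):
    return a + b  # genuinely computes the sum

# "Blockhead" version: precompute every answer over a finite input domain.
DOMAIN = range(10)
BLOCKHEAD_TABLE = {(a, b): add(a, b) for a in DOMAIN for b in DOMAIN}

def blockhead_add(a, b):
    return BLOCKHEAD_TABLE[(a, b)]  # merely retrieves; no addition occurs

# Externally indistinguishable on the covered domain...
assert all(add(a, b) == blockhead_add(a, b)
           for a in DOMAIN for b in DOMAIN)
# ...yet the internal process (computation vs. retrieval) is entirely
# different, which is the sense in which the lookup-table system might
# be a zombie despite identical behavior.
```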
>
> Jason
>
>