[ExI] Zombie glutamate

Tomaz Kristan protokol2020 at gmail.com
Sun Feb 15 09:26:42 UTC 2015


I am very sure that you can have a Turing machine in any form you want.
From Lego, to electronics, to an old steam locomotive on long rails picking
up stones and laying them down in Turing machine fashion ... as long as
this machine performs a good enough simulation of my current brain, I will
feel just as I do writing this.
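
To make "Turing machine fashion" concrete, here is a minimal sketch of the
update rule any such machine runs, whether built from Lego, relays, or
stones on rails. It is Python with a toy transition table of my own
invention (a unary incrementer), not anything brain-related:

# Minimal Turing machine sketch; the rules below are a toy example
# (append a 1 to a block of 1s), not a brain simulation.
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", blank="_", halt="halt"):
    cells = defaultdict(lambda: blank, enumerate(tape))  # unbounded tape
    head = 0
    while state != halt:
        symbol = cells[head]
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol           # lay a stone down (or pick one up)
        head += 1 if move == "R" else -1   # roll the locomotive right or left
    return "".join(cells[i] for i in sorted(cells))

rules = {
    ("start", "1"): ("1", "R", "start"),   # scan right over the 1s
    ("start", "_"): ("1", "R", "halt"),    # hit the blank, write one more 1
}
print(run_turing_machine(rules, "111"))    # prints 1111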



On Sun, Feb 15, 2015 at 8:23 AM, Rafal Smigrodzki <
rafal.smigrodzki at gmail.com> wrote:

>
>
> On Sun, Feb 15, 2015 at 12:42 AM, Stathis Papaioannou <stathisp at gmail.com>
> wrote:
>>
>>
>> Yes, in theory there could be a system that interprets redness but does
>> not experience redness. But if the system did experience redness and a part
>> of it were replaced with a functional isomorph, then it would still claim to
>> experience redness and actually experience redness. The example I gave
>> before was a physically different but chemically identical form of
>> glutamate. It's an experiment that we could actually do today. What do you
>> expect would happen? How would you interpret the results?
>>
>
> ### The problem with suggesting that qualia are determined by the exact
> physical structure of the entity experiencing them, rather than by
> functional isomorphism, is that you can't justifiably stop at some point on
> the scale of "exact". You suggested a thought experiment with substituting
> glutamate, but one can go on: what if qualia perceived using yesterday's
> brain (under slightly different gravitational, electromagnetic and chemical
> influences) are substantially different from today's? Of course, since our
> memories of yesterday's qualia are retrieved using today's brain, we
> wouldn't know. All of us, both conscious and philosophically zombified,
> would make the same mouth noises. What if the movement of Jupiter, well
> known as the bringer of jollity, makes our qualia surreptitiously dance a
> merry jig?
>
> This said, I don't think that functional isomorphism can be defined
> strictly behaviorally - to define function you also need to observe the
> internal processing of information in the system, not just its interactions
> with the environment.
>
> It remains an open question whether, if you use a different processing
> algorithm to compute the same properties of an experienced object, you
> might have qualitatively different experiences.
>
> Reflectances (i.e. color) in a visual input can be calculated by a cortex
> or a robotic visual system, and could trigger the same behavior - correctly
> naming colors in pictures. Would the corresponding qualia be different? My
> guess is yes, they would, much like the smell and the look of a skunk can
> trigger the same verbal output but do differ enormously on the subjective
> level.
>
> Once we are able to connect a robotic color discriminator directly to your
> brain, while keeping the old visual cortex around, we will be able to
> confirm that the qualia differ, although in what particular way would most
> likely remain ineffable.
>
> Rafał
>
>
>


-- 
https://protokol2020.wordpress.com/