[ExI] multiple realizability
Gordon Swobe
gts_2000 at yahoo.com
Mon Feb 1 01:47:46 UTC 2010
--- On Sun, 1/31/10, Eric Messick <eric at m056832107.syzygy.com> wrote:
>> This kind of processing goes on in every
>> software/hardware system.
>
> Yes, and apparently you didn't understand me. I
> already addressed this issue later in the same message.
> It's at a different layer of abstraction.
The layer of abstraction does not matter to me. What does matter is the extent to which the system's supposed mental operations consist of computational processes operating over formal elements, i.e., the extent to which it operates by formal programs. To that extent, in my view, the system lacks a mind.
One can conceive of an "artificially" constructed neural network that is in every respect identical to a natural brain, in which case that machine would have a mind. So let's be clear: my objection is not that strong AI cannot happen. It is that it cannot happen in software/hardware systems, networked or stand-alone.
To make my point even clearer: I reject the doctrine of multiple realizability. I do not believe we can extract the mind from the neurological material that causes the subjective mental phenomena that characterize it, as if one could put a mind on a massive floppy disk and then load that "mental software" onto another substrate. I reject that idea as nothing more than a 21st-century version of Cartesian mind/matter dualism.
The irony is that people who don't understand me call me the dualist, suggesting that I, rather than they, posit the existence of some mysterious mental substance distinct from brain matter. I hope Jeff Davis catches this message.
-gts