[ExI] Semiotics and Computability
stathisp at gmail.com
Thu Feb 11 11:28:56 UTC 2010
On 11 February 2010 00:41, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> Computers can reproduce just about any pattern. But a computerized pattern of a thing does not equal the thing patterned.
> I can for example reproduce the pattern of a tree leaf on my computer. That digital leaf will not have the properties of a real leaf. No matter what natural things we simulate on a computer, the simulations will always lack the real properties of the things simulated.
> Digital simulations of things can do no more than *simulate* those things. It mystifies me that people here believe simulations of organic brains should somehow qualify for an exception to this rule.
I'll respond only to this point, since John Clark has already responded to your other points.
Would you say that a robot that seems to walk isn't really walking
because it is not identical to, and therefore lacks all of the
properties of, a human walking? The argument is not that a digital
computer is *identical* with a biological brain, otherwise it would be
a biological brain and not a digital computer. The argument is that
the computer can reproduce the consciousness of the brain if it is
able to reproduce the brain's behaviour. If it can't, you can't
explain what would happen instead, and your solution is to advise that
I change the question to one more to your liking.
> Neuroscientists should someday have at their disposal perfect digital simulations of brains to use as tools for doing computer-simulated brain surgeries. But according to you and some others, those digitally simulated brains will have consciousness and so might qualify as real people. This would mean medical students will have access to computer simulations of hearts to do simulated heart surgeries, but they won't have access to the same kinds of computerized tools for doing simulated brain surgeries. Those darned computer simulated brains won't sign the consent forms.
> People like me will want to do the simulated surgeries anyway. The Society for the Prevention of Simulated Cruelty to Simulated Brains will oppose me.
What if a race of robots landed on Earth and decided to do cruel
experiments on humans, on the assumption that mere organic matter
couldn't have a mind or feelings, despite behaving as if it did?