[ExI] Semiotics and Computability
msd001 at gmail.com
Fri Feb 19 02:20:34 UTC 2010
On Thu, Feb 18, 2010 at 7:54 PM, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> As you can see, this Champollion fellow started with some understanding. He built on that understanding to decipher more symbols. Presumably Thomas Young did the same before him.
> Too bad our man in the room has no understanding of any symbols and so no knowledge base to build on. He can do no more than follow the syntactic instructions in the program: if input = "squiggle" then output "squoogle".
Again I ask whether the man is a metaphor or an actual man. If a metaphor,
then you might as well replace him with the mashed potatoes or some
other dumb machine; his person status only confuses the issue. If he
is an actual person, then we should assume he has enough of an a priori
"knowledge base" to "interpret symbol," "transform input to
output," and so on. Suppose further that he has some processing
ability beyond that required of the mashed-potato-driven machine.
With that pattern-recognition ability the man is able to notice the
pattern "squiggle->squoogle," so he no longer needs to compare the
shape against the lookup/transformation rules: he has internalized
that knowledge (committed it to memory, established an onboard
knowledge base, whatever). From outside the box we might observe that
the operation has become optimized, because it no longer takes 20
seconds to 'compute' and now takes only 2. The man in the box might
rather spend those 18 seconds pursuing other goals - like checking his
notes for what has historically come next after the
squiggle->squoogle transformation.
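That internalization step can be sketched as simple memoization. This is a hypothetical illustration, not anything from the original thought experiment: the rulebook, the `memory` dict, and the delay standing in for the slow 20-second lookup are all my own stand-ins.

```python
import time

RULEBOOK = {"squiggle": "squoogle"}  # the syntactic lookup rules
memory = {}                          # what the man has committed to memory

def transform(symbol):
    if symbol in memory:             # internalized: near-instant recall
        return memory[symbol]
    time.sleep(0.02)                 # stand-in for the slow rulebook lookup
    result = RULEBOOK[symbol]
    memory[symbol] = result          # commit to memory for next time
    return result

print(transform("squiggle"))  # slow path: consults the rulebook
print(transform("squiggle"))  # fast path: answered from memory
```

The observable behavior from outside the box is the same either way; only the latency changes, which is exactly the point.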
Even if input A has been observed 80% of the time and input B 20%,
there may come a time when a third option is encountered. Still, the
optimization of expecting A or B to follow squiggle->squoogle gives
the man another advantage in signal processing.
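That kind of anticipation can be sketched as a frequency tally over observed follow-ups. Again a hypothetical illustration: the `observe`/`predict` functions and the 80/20 sample stream are mine, chosen to match the percentages above.

```python
from collections import Counter

history = Counter()  # the man's notes on what has come next

def observe(next_input):
    history[next_input] += 1

def predict():
    # expect the historically most frequent follow-up, if any
    return history.most_common(1)[0][0] if history else None

for sym in ["A"] * 8 + ["B"] * 2:  # 80% A, 20% B, as in the example
    observe(sym)

print(predict())  # anticipates "A" - though a novel third input can still arrive
```

The prediction is only a prior: when the third option does show up, the tally simply absorbs it, which is what makes the strategy an optimization rather than a fixed rule.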
I didn't expect to describe it as such, but this process is very
similar to what (I imagine) goes on in a data-compression or realtime
audio-encoding application.
Are you going to argue that every action performed by the man in the
box is part of the transformation process? Are you taking away his
apparent optimizing intelligence by scripting his actions? If you
remove all responsibility from the man in the box, you move that
responsibility to the author of the script. That's programming. I
thought you said programming isn't intelligence... If I haven't
stated this position clearly, please say so and I'll explain.