[ExI] Personal conclusions
aware at awareresearch.com
Thu Feb 4 18:20:45 UTC 2010
On Thu, Feb 4, 2010 at 3:59 AM, Stefano Vaj <stefano.vaj at gmail.com> wrote:
> On 3 February 2010 23:05, Gordon Swobe <gts_2000 at yahoo.com> wrote:
>> I have no interest in magic. I contend only that software/hardware systems as we conceive of them today cannot have subjective mental contents (semantics).
> Yes, this is clear by now.
> The bunch of threads of which Gordon Swobe is the star, which I have
> admittedly followed on and off, partly because of their largely
> repetitive nature, has been interesting, albeit disquieting, for me.
Interesting to me too, as an example of our limited capability at
present, even among intelligent, motivated participants, to
effectively specify and frame disparate views and together seek a
greater, unifying context which either resolves the apparent
differences or serves to clarify them.
> Not really to hear him reiterate innumerable times that for whatever
> reason he thinks that (organic? human?) brains, while obviously
> sharing universal computation abilities with cellular automata and
> PCs, would on the other hand somehow escape the Principle of
> Computational Equivalence.
Gordon exhibits a strong reductionist bent; he seems to believe that
Truth WILL be found if only one can see closely and precisely enough
into the heart of the matter.
Ironically, to the extent that he parrots Searle, his logic is
impeccable, but he went seriously off track when engaging with
Stathis in the neuron-replacement thought experiment. Most who engage
in this debate fall into the same trap of defending functionalism, and
this is where the Chinese Room Argument gets most of its mileage, but
functionalism, materialism and computationalism are really not at
issue. Searle quite clearly and coherently shows that syntax DOES NOT
entail semantics, no matter how detailed the implementation.
So at the sophomoric level representative of most common objections,
the debate spins around and around, as if Searle were <gasp> denying
functionalist, materialist, or computationalist accounts of reality.
He's not, and neither is Gordon. The point is that there's a paradox.
[And paradox is always a matter of insufficient context. In the
bigger picture, all the pieces must fit.]
John Clark jumps in to hotly defend Truth and his simple but circular
view that consciousness is a Fact, it obviously arrived via Evolution
thus Evolution is the key. And how dare you deny Evolution--or Truth!?
Stathis patiently (he has plenty of patients, as well as patience)
rehashes the defense of functionalism which needs no defending, and
although Gordon occasionally asserts that he doesn't disagree (on
this) he doesn't go far enough to acknowledge and embrace the apparent
truth of functionalist accounts WHILE highlighting the ostensible
paradox presented by Searle.
Eric and Spencer jump in (late in the game, if a merry-go-round can be
said to have a "later" point in the ride) and contribute the next
layer after functionalism: If we accept that we have "consciousness",
"and unquestionably we do", and we accept materialist, functionalist,
computationalist accounts of reality, then the answer is not to be
found in the objects being represented, but in the complex
associations between them. They too are correct (within their
context) but their explanation only raises the problem another level,
no closer to resolution.
> But so many of the people who have engaged in the discussion of
> the point above, while they may no longer believe in a religious
> concept of "soul", seem to accept without a second thought that some
> very poorly defined Aristotelian essences exist per se,
> corresponding to the symbols "mind", "consciousness", "intelligence",
> and that their existence in the sense above is an a priori
> not really open to analysis or discussion.
Yes, many of our "rationalist" friends decry belief in a soul, but
passionately defend belief in an essential self--almost as if their
self depended on it. And along the way we get essential [qualia,
experience, intentionality, free-will, meaning, personal identity...]
And despite accumulating evidence of the incoherence of consciousness,
with all its gaps, distortions, fabrication and confabulation, we hang
on to it, and decide it must be a very Hard Problem. Thus inoculated,
and fortified by the biases built in to our language and culture, we
know that when someone comes along and says that it's actually very
simple, cf. Dennett, Metzinger, Pollack, Buddha..., we can be sure,
even though we can't make sense of what they're saying, that they must
be wrong.
A few deeper thinkers, aiming for greater coherence over greater
context, have suggested that either all entities "have consciousness"
or none do. This is a step in the right direction. The question,
thus clarified, might then be decided in simple information-theoretic
terms. But even then, more often they will side with Panpsychism
(even a rock has consciousness, but only a little) than face the
possibility of non-existence of an essential experiencer.
> Now, if this is the case, I sincerely have trouble finding a
> reason why we should not accept, on an equal basis, the article of
> faith that Gordon Swobe proposes as to the impossibility for a
> computer to exhibit the same.
> Otherwise, we should perhaps reconsider not so much the AI
> research programmes in place, but rather, say, the Vienna Circle,
> Popper or Dennett.
Searle is right, in his logic. Wrong, in his premises. No formal
syntactic system produces semantics. Further, to the extent that the
human brain is formally described, no semantics will be found there
either. We never had it, and don't need it. "It" can't even be
defined in functional terms. The notion is incoherent, despite the
strength and seductiveness of the illusion.
It's simply and necessarily how any system refers to references to
itself. Yes, it's recursive, and therefore unfamiliar and unsupported
by a language and culture that evolved to deal with relatively shallow
context and linear relationships of cause and effect. Meaning is not
as perceived by the observer, but in the response of the observer,
determined by its nature within a particular context. Yes, it may
feel like a direct attack on the sanctity of Self, but it's not. It
destroys nothing that ever existed, and opens up thinking on agency
that is just as valid, extending beyond the boundaries of the cranium,
or the skin, or the organism plus its tools, or ...
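To make the recursion concrete, here is a toy sketch of my own (nothing
from the thread, and the class and method names are purely
illustrative): a system's "self" can be modeled as just one more entry
in its own world-model, a reference to a reference, with no extra
essence required, and "meaning" as nothing but the system's response to
a symbol within a context.

```python
# Toy illustration: a "self" as a recursive entry in the agent's own
# model, and "meaning" as response-in-context rather than essence.

class Agent:
    def __init__(self, name):
        self.name = name
        self.model = {}          # the agent's representation of the world

    def observe(self, key, value):
        self.model[key] = value

    def build_self_model(self):
        # The agent represents *itself* as one more entry in its model:
        # a reference to its own representation, recursively.
        self.model["self"] = self.model

    def respond(self, symbol, context):
        # "Meaning" here is just the agent's response to a symbol
        # within a context, determined by the agent's own nature.
        return (symbol, context, self.name)

agent = Agent("a1")
agent.observe("sky", "blue")
agent.build_self_model()

# The self-model is recursive: the model contains itself, so following
# the "self" reference any number of times lands in the same structure.
assert agent.model["self"] is agent.model
assert agent.model["self"]["self"]["sky"] == "blue"
```

The point of the sketch is only that nothing breaks when the "self" is
a loop in the data rather than a special ingredient.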
Oh well. Baby steps...