[ExI] Personal conclusions

Stefano Vaj stefano.vaj at gmail.com
Thu Feb 4 11:59:50 UTC 2010


On 3 February 2010 23:05, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> I have no interest in magic. I contend only that software/hardware systems as we conceive of them today cannot have subjective mental contents (semantics).

Yes, this is clear by now.

The many threads in which Gordon Swobe is the star, which I have
admittedly followed only on and off, partly because of their largely
repetitive nature, have been interesting for me, albeit disquieting.

Not really because of hearing him reiterate innumerable times that,
for whatever reason, he thinks (organic? human?) brains, while
obviously sharing universal computational abilities with cellular
automata and PCs, would somehow escape the Principle of
Computational Equivalence.
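As an aside, for readers unfamiliar with why cellular automata get cited alongside PCs here: the canonical illustration is elementary cellular automaton Rule 110, which is known to be Turing-universal despite an almost trivial update rule. A minimal sketch (the fixed zero boundary and grid width are assumptions of this illustration, not part of the universality proof):

```python
def rule110_step(cells):
    """Apply one synchronous Rule 110 update to a list of 0/1 cells.

    Boundary cells see an implicit 0 neighbour (an assumption of this
    sketch; the universality construction uses an infinite tape).
    """
    n = len(cells)
    new = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        center = cells[i]
        right = cells[i + 1] if i < n - 1 else 0
        # Encode the neighbourhood as a 3-bit number 0..7; the new state
        # is the corresponding bit of the rule number 110 (0b01101110).
        pattern = (left << 2) | (center << 1) | right
        new.append((110 >> pattern) & 1)
    return new

# Evolve a single live cell and print a few generations.
cells = [0] * 20 + [1] + [0] * 20
for _ in range(10):
    print("".join(".#"[c] for c in cells))
    cells = rule110_step(cells)
```

A single live cell grows a characteristic pattern toward the left; the interacting "gliders" in larger runs are what Cook's universality proof exploits.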

But because so many of the people who have engaged in the discussion
of the point above, while they may no longer believe in a religious
concept of "soul", seem to accept without a second thought that some
very poorly defined Aristotelian essences exist per se,
corresponding to the symbols "mind", "consciousness" and
"intelligence", and that their existence in this sense is an a
priori not really open to analysis or discussion.

Now, if this is the case, I sincerely have trouble finding a reason
why we should not accept, on an equal basis, the article of faith
that Gordon Swobe proposes as to the impossibility for a computer to
exhibit the same.

Otherwise, we should perhaps reconsider a little not so much the AI
research programmes in place, but rather, say, the Vienna Circle,
Popper or Dennett.

-- 
Stefano Vaj
