[ExI] How not to make a thought experiment

Will Steinberg steinberg.will at gmail.com
Sat Feb 20 04:59:04 UTC 2010


What has consistently boggled my mind is the fact that the room described by
the CRA (Searle's Chinese Room Argument), EVEN IN THEORY, WILL NOT EVEN MIMIC
CONSCIOUSNESS.

Upon closer examination, the idea of a "rulebook for responses" is bogus and
impossible.  Consciousness necessitates NON-BIJECTIVE, CHANGING RESPONSES:
the same input has to be able to produce different outputs depending on
everything that came before.  It is ABSOLUTELY NOT a fixed squiggle-for-squoggle
mapping, or any of this, if I may pull a Clark, BULLSHIT.
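To make that concrete, here is a minimal Python sketch of the distinction.
The phrases and the StatefulResponder class are my own made-up illustration,
not anything from Searle's paper:

# A Searle-style "rulebook": a fixed symbol-to-symbol lookup table.
# The same input always gets the same output, no matter what came before.
RULEBOOK = {
    "how are you?": "fine, thanks.",
    "what did I just say?": "you said something.",
}

def rulebook_reply(message):
    return RULEBOOK.get(message, "I don't understand.")


# A stateful responder: the reply depends on the conversation so far,
# so the same question can get a different answer each time it is asked.
class StatefulResponder:
    def __init__(self):
        self.history = []

    def reply(self, message):
        self.history.append(message)
        if message == "what did I just say?" and len(self.history) > 1:
            return "you said: " + self.history[-2]
        return "noted; that's message number %d." % len(self.history)


# r = StatefulResponder()
# r.reply("hello")                        -> "noted; that's message number 1."
# r.reply("what did I just say?")         -> "you said: hello"
# rulebook_reply("what did I just say?")  -> "you said something.", every single time

No static lookup table can even track what was just said, because the right
answer depends on the history of the exchange, not on the current input alone.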

The analogy is wrong because it assumes an untruth, namely that such a static
rulebook could actually carry on a human conversation, and hides that
assumption behind the neat idea of the rulebook.  Please stop using it to back
up anything.  All it proves is that a super-ELIZA is not conscious.  We already
know this.  The rulebook machine is not conscious.  Humans are not rulebook
machines.  The comparison tells us nothing.