[ExI] Newbie Question: Consciousness and Intelligence

Spencer Campbell lacertilian at gmail.com
Fri Feb 12 22:42:13 UTC 2010

Christopher Luebcke <cluebcke at yahoo.com>:
> I was wondering, given the lively back and forth I've seen on this list, whether the participants are agreed on the meanings of the terms "consciousness" and "intelligence".

I've been bandying about my own personal definition of intelligence,
at least, for something on the order of two weeks. Below is a segment
of one of my posts from eight days ago. It is startlingly apropos.

Spencer Campbell <lacertilian at gmail.com>:
> Stefano Vaj <stefano.vaj at gmail.com>:
>> ... very poorly defined Aristotelic essences would per se exist
>> corresponding to the symbols "mind", "consciousness", "intelligence" ...
> Actually, I gave a fairly rigorous definition for intelligence in an
> earlier message. I've refined it since then:
> The intelligence of a given system is inversely proportional to the
> average action (time * work) which must be expended before the system
> achieves a given purpose, assuming that it began in a state as far
> away as possible from that purpose.
> (As I said before, this definition won't work unless you assume an
> arbitrary purpose for the system in question. Purposes are roughly
> equivalent to attractors here, but the system may itself be part of a
> larger system, like us. Humans are tricky: the easiest solution is to
> say they swap purposes many times a day, which means their measured
> intelligence would change depending on what they're currently doing.
> Which is consistent with observed reality.)
> I can't give similarly precise definitions for "mind" or
> "consciousness", and I wouldn't be able to describe the latter at all.
> Tentatively, I think consciousness is devoid of measurable qualities.
> This would make it impossible to prove its existence, which to my mind
> is a pretty solid argument for its nonexistence. Nevertheless, we talk
> about it all the time, throughout history and in every culture. So
> even if it doesn't exist, it seems reasonable to assume that it is at
> least meaningful to think about.
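The quoted definition is operational enough to compute: measure the action (time * work) a system expends over several runs toward a purpose, then take the reciprocal of the mean. A minimal sketch of that calculation, with all function names and numbers purely illustrative rather than from the original post:

```python
from statistics import mean

def intelligence(trials):
    """Score a system under the quoted definition: inversely
    proportional to the average action (time * work) expended
    before reaching a given purpose.

    `trials` is a list of (time, work) pairs, one per run,
    each run assumed to start as far from the purpose as possible.
    """
    actions = [t * w for t, w in trials]
    return 1.0 / mean(actions)

# A system that reaches its purpose with less action scores higher:
fast = intelligence([(2.0, 3.0), (1.0, 4.0)])   # mean action = 5.0
slow = intelligence([(10.0, 3.0), (8.0, 5.0)])  # mean action = 35.0
```

Note that, as the parenthetical above warns, the score only means something relative to an assumed purpose; swap the purpose and the same system yields a different number.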
