[ExI] The simplest possible conscious system

Spencer Campbell lacertilian at gmail.com
Thu Feb 4 22:24:59 UTC 2010


Ben Zaiboc <bbenzai at yahoo.com>:
> Hm, interesting challenge.
>
> I'd probably define Intelligence as problem-solving ability, and
> Understanding as the association of new 'concept-symbols' with established ones.
>
> I'd take "Conscious" to mean "Self-Conscious" or "Self-Aware", which almost certainly involves a mental model of one's self, as well as an awareness of the environment, and one's place in it.

Somehow I was expecting people to disagree radically on these
definitions, but your conceptions of consciousness, intelligence, and
understanding are actually very similar to my own.

Understanding is notably different in my mind, though: I'd say to have
a mental model of a thing is to understand that thing. Symbols don't
really enter into it, except that we use them as shorthand to refer to
understood models.

The more closely your model's behavior matches the target system's,
the better you understand that system!
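
Here's a toy Python sketch of what I mean -- the target system, the
models, and the scoring rule are all invented for the example, so take
it as an illustration of the idea rather than a real measure:

    # Score "understanding" as agreement between a mental model and the
    # system it models. Everything here is made up for illustration.
    def target_system(x):
        # The "real" system being understood: some input/output behavior.
        return 3 * x + 1

    def good_model(x):
        return 3 * x + 1   # behaves identically -> full understanding

    def rough_model(x):
        return 2 * x       # behaves only roughly alike -> partial understanding

    def understanding_score(model, system, probes):
        # Mean absolute disagreement, inverted so higher = better understood.
        error = sum(abs(model(x) - system(x)) for x in probes) / len(probes)
        return 1.0 / (1.0 + error)

    probes = range(10)
    print(understanding_score(good_model, target_system, probes))   # 1.0
    print(understanding_score(rough_model, target_system, probes))  # ~0.15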

Ben Zaiboc <bbenzai at yahoo.com>:
> I'd guess that the simplest possible conscious system would have an embodiment ('real' or virtual) within an environment, sensors, actuators, the ability to build internal representations of both its environment and itself, and by implication some kind of state memory. Hm, maybe we already have conscious robots, and don't realise it!

I can conceive of a disembodied consciousness, interacting with its
environment only through verbal communication, which would be simpler.
Top that!

<jameschoate at austin.rr.com>:
> I would agree; however, there are a couple of issues that must be addressed before it becomes meaningful.
>
> First, what is 'conscious'? That definition must not use human brains as an axiomatic measure.

I agree. The only problem is that, if consciousness exists, any
English definition of it would at least be inaccurate, if not outright
incorrect. We can only approximate the speed of light using feet, but
we can describe it exactly with meters, because the meter is defined
so that c comes out as an exact whole number of meters per second.
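
To make the units point concrete, a quick check in Python (the figures
are just the standard definitions of the meter and the foot, nothing
from this thread):

    # Since 1983 the meter has been defined so that c is exactly
    # 299,792,458 m/s; the international foot is exactly 0.3048 m.
    c_m_per_s = 299792458
    m_per_ft = 0.3048

    print(c_m_per_s / m_per_ft)   # ~983571056.43 ft/s -- not a whole
                                  # number, so any ft/s figure is an
                                  # approximation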

I'm not even sure whether consciousness is better considered a binary
state, present or absent, or whether we should be talking about
degrees of consciousness. Certainly, intelligence and understanding
are both scalar quantities. Is the same true of consciousness?

My current theory is that consciousness requires recursive
understanding: that is, understanding of understanding.
Meta-understanding. I don't know whether consciousness exhibits any
emergent properties over and above that, though, or whether there are
any other prerequisites.
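
If it helps, here is the rough shape of that idea as a Python sketch --
every class and attribute is invented purely to illustrate the
nesting, not to claim this is how minds actually work:

    # "Meta-understanding" as a model whose subject is itself a model,
    # including a model of one's own modeling. All names are invented.
    class Model:
        def __init__(self, subject):
            self.subject = subject   # the thing this model is a model of

    class Agent:
        def __init__(self):
            self.world_model = Model("environment")
            self.self_model = Model(self)              # understanding of self
            self.meta_model = Model(self.self_model)   # understanding of understanding

    a = Agent()
    print(a.self_model.subject is a)             # True
    print(a.meta_model.subject is a.self_model)  # True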


