[ExI] Some new angle about AI

Lee Corbin lcorbin at rawbw.com
Tue Jan 5 05:41:10 UTC 2010


Stefano and Stathis, respectively, wrote:

> "Consciousness" being hard to define as else than a social construct
> and a projection (and a pretty vague one, for that matter, inasmuch as
> it should be extensible to fruitflies...), the real point of the
> exercise is simply to emulate "organic-like" computational abilities
> with acceptable performances, brain-like architectures being
> demonstrably not too bad at the task.

The key question is whether or not you would choose to be
uploaded, given a preview of the resulting machinery. That is
what all these discussions are really about. As for me, so long
as there is a *causal* mechanism (i.e. information flow from
state to state, with time being a key element), and it produces
behavior within the range of normal behavior for me, then I'm
on board.

Stathis:

> I can't define or even describe the taste of salt, but I know what I
> have to do in order to generate it, and I can tell you whether an
> unknown substance tastes salty or not. That's what I want to know
> about consciousness in general: I can't define or describe it, but I
> know it when I have it, and I would like to know if I would still have
> it after undergoing procedures such as brain replacement.

Yes, that's it. It is logically conceivable, after all, as
several on this list maintain, that each time you replace a
biologically operating part with a mechanism that, say, does
not involve chemical transformations, your experience is
diminished proportionally, with the end result that a fully
non-biological entity has none of the consciousness you refer
to. While *logically* possible, of course, I consider this
possibility very remote.

Lee



