[ExI] Mental Phenomena

John Clark johnkclark at gmail.com
Fri Feb 14 10:35:04 UTC 2020

On Thu, Feb 13, 2020 at 8:09 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

*> Being uploaded to something that isn't like anything, will NOT be the
> same.*

So you think an intelligent being should be judged not by the nature of his
character but by the degree of squishiness and wetness of his brain. Odd.

*> The fact of the matter is, redness is something.  We don't know yet,
> what,*

Yes, redness is something, and we do know what redness is: it's a label. And
if everything had the same label it would be useless; that's why there is
no such thing as elemental redness.

*> So this basically proves that a machine that is intentionally designed
> to be independent of such things (via lots of dictionaries.) aren't going
> to have any redness.*

Proves?! I still don't know what you mean by "lots of dictionaries", nor do
I understand why a voltage needs lots of them but a chemical doesn't; or
was it the other way around? And you haven't explained why random mutation
and natural selection can produce qualia but an intelligent designer (aka a
computer architect) can't.

*> If we want a phenomenal simulation, we need to make simulated knowledge
> out of basement level physical*

Humans used their Evolution-produced brains to design computers, and both
computers and brains make use of the exact same basement-level physics.

*> this doesn't mean simulations, even phenomenal simulations, aren't going
> to be possible. If we don't want to worry about things like turning humans
> off, when we restart our simulations, and other such immoral stuff, we just
> do everything in an amoral abstract way, that isn't physically like
> anything.*

Brent, try asking yourself a different question: instead of asking how
you're conscious, ask why. Why did Evolution go to the bother of making you
conscious? However important consciousness is to you and me, to Evolution
it's irrelevant, because Natural Selection can't directly detect
consciousness any better than I can directly detect consciousness in other
people; but both I and Natural Selection CAN detect intelligent behavior.
So unless Darwin was dead wrong, consciousness MUST be a byproduct of
intelligence. That's why I get so impatient with consciousness theories
that just ignore intelligence. And that's why, after saying consciousness is
the way data feels when it is being processed intelligently, there is not
much more of any consequence that can be said about consciousness.

I suppose it could be argued that maybe Evolution just got lucky and came
up with a sort of consciousness circuit by accident, but such a part would
not be stable. Consciousness by itself confers no adaptive advantage; only
intelligent behavior does. So even if consciousness emerged by pure chance
millions of years ago, by today it would be long gone due to genetic
drift, just as the eyes of creatures that have lived for thousands of
generations in dark caves have disappeared. And yet here I am, and although
I can't prove it to you, I know for a fact that I am conscious. So if a
"consciousness circuit" did nothing but generate consciousness, it would be
gone by now. It follows that intelligence and consciousness are inextricably
linked, and that the Turing Test works for consciousness and not just
intelligence.

Finally, a critic could say that maybe Evolution came up with consciousness
because it was the simplest path (but not the only path) to intelligence;
but if so, then we will also find it easier to make an intelligent conscious
computer than an intelligent non-conscious one. So if you run across an
AI, your default position should logically be to assume it's conscious.

 John K Clark
