[ExI] Trilemma of Consciousness

John Clark johnkclark at gmail.com
Sun May 28 23:28:34 UTC 2017


On Sat, May 27, 2017  William Flynn Wallace <foozler83 at gmail.com> wrote:

> Maybe I missed something. 'Achieve' consciousness? What does that even
> mean?

It means making something that is conscious from something that is not.



> In an AI, OK, got it. In a person it makes utterly no sense.


We made the AI and Evolution made us, so I don't see the problem. And
Evolution found it much easier to come up with feeling than the ability to
reason; it certainly came up with feeling first. The most ancient part of
the brain, the spinal cord, the medulla and the pons, is similar to the
brain of a fish or amphibian and is about 400 million years old; it deals
in aggressive behavior, territoriality and social hierarchies. The Limbic
System is about 150 million years old; it is the source of awe and
exhilaration and is the active site of many psychotropic drugs. The
amygdala, a part of the Limbic System, has much to do with fear, and after
some animals developed a Limbic System they started to take care of their
young, so it probably has something to do with love too.

It is our grossly enlarged neocortex that makes the human brain so unusual
and so recent; it only started to get ridiculously large about 3 million
years ago. It deals in deliberation, spatial perception, speaking, reading,
writing and mathematics. The one new emotion we got was worry, probably
because the neocortex is also the place where we plan for the future.

If nature came up with feeling first and high-level intelligence much
later, I don't see why the opposite would be true for our computers. It's
probably a hell of a lot easier to make something that feels but doesn't
think than something that thinks but doesn't feel.

> Intelligence, depending on the definition

Definitions are of trivial importance, examples are not.



> Learning millions of facts does not make us smarter than encyclopedias.

Books don't behave intelligently because books can't compute.

> If we could agree on some definition of consciousness, then we might be
> able to program it, but not until then.

I have a definition of consciousness and there will never be a better one:
consciousness is the way John K Clark feels.

> So - what characteristics define consciousness in an AI?

The same characteristics you use to infer consciousness in your fellow
human beings.



> What characteristics do we have to find to determine whether a person
> is conscious?

Intelligent behavior, of course; that's why I think somebody taking a
calculus exam is probably conscious but somebody sleeping, under
anesthesia, or dead probably is not.

 John K Clark

