[ExI] Consciousness and paracrap

Stathis Papaioannou stathisp at gmail.com
Wed Feb 17 10:27:15 UTC 2010


On 17 February 2010 14:58, Anna Taylor <femmechakra at yahoo.ca> wrote:
> I know I'm simple-minded, but I don't understand why consciousness
> is such a philosophical debate.  I wonder why science sometimes
> complicates things.
>
> Let's say hypothetically I could change the definitions. (This is
> against academic rules, but if Gordon gets to play master of
> definitions, why shouldn't I? :)  Let's say the word conscious meant
> alive versus dead.  Anything that is conscious is alive and awake.
> Could everyone agree on that?
> The problem lies in determining the levels of consciousness.
> Does a worm possess consciousness?  Will an AI?
>
> I'm rather curious to understand why the scientific community treats
> the term "the subconscious" as taboo, as things would be much easier
> to explain if it didn't.  What if the subconscious is Darwin's theory
> of evolution at work: the basic instinct for survival?  A tree
> requires many things to keep it alive, but it doesn't know it.
> It depends on the sun, the earth and the rain, and it will continue
> to grow or it will die.  We need trees, and they are part of the
> evolutionary process.  I would say they are subconsciously alive.
> In humans it's the instinct to take your hand off a hot stove:
> the embedded code that evolution has installed.
>
> A person who is under anaesthesia would therefore still be conscious
> and subconsciously alive.  The person requires oxygen, food and
> water, yet has no idea of it.  This would then mean that
> consciousness and awareness go hand in hand.
>
> What if consciousness is being awake, giving the memory the
> capability of recalling, extracting and processing information,
> while awareness is intelligence, experience and sense?  A worm
> doesn't live in a subconscious state; it recalls, extracts and
> processes information, but does it recall the experience, or have a
> sense of why it does the things it does?  Shouldn't this be an
> underlying question?  If we knew the worm felt something when we
> poked it with a knife, would we declare it "aware"?  I know my cat
> is aware because after he once stuck his head too far into a bottle,
> he never did it again.  I think to be human is to have awareness as
> well as consciousness.
>
> I believe that neither a strong nor a weak AI will have any
> subconscious (well, at least until technology merges with biology;
> then that will be a great philosophical discussion), but only a
> strong AI will have consciousness and some degree of awareness.
> What is scary about strong AI is that it may have the maximum
> capacity of intelligence yet have no experience or sense.  We had
> better hope that the programmer is fully aware.
>
> Stathis Papaioannou stathisp at gmail.com
> Sun Dec 13 23:08:02 UTC 2009 questioned gts_2000 at yahoo.com
>
>> To address the strong AI / weak AI distinction I put to you a
>> question you haven't yet answered: what do you think would happen
>> if part of your brain, say your visual cortex, were replaced with
>> components that behaved normally in their interaction with the
>> remaining biological neurons, but lacked the essential ingredient
>> for consciousness?
>
> My observation:
> Well, my contact lenses work fine.  My memory would recall, extract
> and process the information.  If the contacts are too weak and I
> can't see, then yes, one of my awareness factors would be limited,
> but it would not stop me from being conscious or from having
> consciousness.
>
> Btw... even if all my crazy posts don't amount to anything, I have
> to say that the Extropy Chat creates a whirl of imagination.  I can
> read something that may lead me to investigate something truly
> beneficial to my understanding.  Thanks.  Ok, back to music...

Anna, here are some definitions that I use:

Consciousness - hard to define, but if you have it you know it;
Strong AI - an artificial intelligence that is both intelligent and conscious;
Weak AI - an artificial intelligence that is intelligent but lacks
consciousness;
Philosophical zombie - same as weak AI.

Several people have commented that we need a definition of
consciousness to proceed, but I disagree. I think everyone knows what
is meant by the word and so we can have a complete discussion without
at any point defining it. For those who say that consciousness does
not really exist: consciousness is that thing you are referring to
when you say that consciousness does not really exist.

With the brain replacement experiment, the idea is that the visual
cortex is where visual perceptions (visual experiences/ consciousness/
qualia) occur. If your visual cortex is destroyed then you are blind,
even if your eyes and optic nerve are intact. When you see something
and describe it, information goes from your eyes to your visual
cortex, from your visual cortex to your speech centre, and finally
from your speech centre to your vocal cords. The question is, what
would happen if your visual cortex were replaced with an artificial
part that sent the same signals to the rest of your brain in response
to signals from your eyes, but lacked visual perception? By
definition, you would see nothing; but also by definition, you would
describe everything put in front of you correctly and you would claim
and honestly believe that you could see normally. How could you be
completely blind but not notice you were blind and behave as if you
had normal vision? And if you think that is a coherent state of
affairs, how do you know you are not currently blind?
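
The functional point can be put in code. Here is a minimal sketch (in
Python; every name in it is hypothetical, invented for illustration):
if the artificial part computes the same input-to-output mapping as
the biological part, every downstream module receives identical
signals, so no behavioural test could ever tell the two apart.

    # Minimal sketch; all names are hypothetical. Two "visual cortex"
    # modules compute the same input -> output mapping. One is
    # stipulated to have visual experience, the other not; the rest
    # of the brain only ever sees the output signals.

    class BiologicalCortex:
        def process(self, retinal_signal):
            # Stipulated: visual experience occurs here.
            return "percept:" + retinal_signal

    class ArtificialCortex:
        def process(self, retinal_signal):
            # Stipulated: no visual experience, but the same mapping.
            return "percept:" + retinal_signal

    def speech_centre(cortex, retinal_signal):
        # The rest of the brain reacts only to the signal it receives.
        percept = cortex.process(retinal_signal).split(":")[1]
        return "I see a " + percept

    # Identical reports from both modules: behaviour cannot reveal
    # whether perception occurred inside the part that was swapped.
    assert (speech_centre(BiologicalCortex(), "cup")
            == speech_centre(ArtificialCortex(), "cup"))

On this picture, the claim "you would see nothing, yet report normal
vision" amounts to insisting that the two classes above differ in some
way that nothing downstream could ever register.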

The purpose of the above is to show that it is impossible (logically
impossible, not just physically impossible) to make a brain part, and
hence a whole brain, that behaves exactly like a biological brain but
lacks consciousness. Either it isn't possible to make such an
artificial component at all, or else it is possible to make such a
component but it will necessarily also have consciousness. The
alternative is to say that you're happy with the idea that you may be
blind, deaf, unable to understand English etc. but neither you nor
anyone else has noticed.

Gordon Swobe's response is that this thought experiment is ridiculous
and I should come up with another one that doesn't challenge the
self-evident fact that digital computers cannot be conscious.


-- 
Stathis Papaioannou


