[ExI] Digital Consciousness

Ben Zaiboc bbenzai at yahoo.com
Thu Apr 25 21:13:27 UTC 2013


John wrote:

>"It is really astonishing that well into the 21'st
>century there are still people who say they believe in Darwin's Theory of
>Evolution and even claim to understand it and yet can say something like
>that. Evolution produced Gordon Swobe and Gordon Swobe is conscious; those
>2 facts are absolutely positively 100% logically inconsistent with what you
>just said above.
>
>I have been pointing this fact out for well over a decade but it seldom
>makes much of an effect; this should bring about a profound change in
>somebody's world-view but it never does; people either give an
>embarrassingly anemic retort or just shrug it off and continue to firmly
>believe 2 contradictory things, that Evolution is true and that
>consciousness and intelligent behavior are unrelated. Why? I'm not a
>psychiatrist but I think it's because people first decide that a computer
>can definitely never be conscious and only then go looking for evidence to
>support their prejudice. And a belief that is not based on logic cannot be
>destroyed by it."

This is interesting.  How is it, then, that you, I, and perhaps five or seven other people on here acknowledge this and have changed our minds about it (certainly it was a change of mind in my case, once I had that "hang on, this makes no sense!" moment)?  Are there certain kinds of people who are swayed at a deep level by logic, and certain kinds who aren't?

I can definitely say that I had a 'belief' (with a small 'b') that my selfness was a unique, necessarily singular and indivisible thing, with the usual baggage: worrying about whether a copy would really be me, which one would 'really' be me if I were copied, which one 'I' would wake up in, and whether I would cease to be aware if I were turned into a machine (ignoring the fact that I am already a machine!), and all that nonsense, until I sat down and actually /thought/ about it, and realised that yes, it was in fact complete nonsense.

(I have to acknowledge the contribution that Linda Nagata's 'Vast' made in this process.  The concept of the ship pilot overwriting his own mind every ninety seconds, for hundreds of years, really made me think deeply about the whole issue, and I must admit it was challenging).

Maybe it's something to do with how desperately someone tends to hang on to their beliefs?  I'm open to persuasion, one way or the other (obviously!).  As long as it makes sense.


Gordon wrote:
> Only here on ExI do I encounter plenty of people who still hope for conscious,
> intentional AI on digital computers, which is what I mean by strong AI. I assume this
> is because of the belief or hope common to many Extropians that we might someday
> achieve digital immortality.


Nope, it's just the remorseless inevitability of logic.  Some of us don't believe things; we think them instead.

You say "hope for", which is an interesting way of putting it.  (I'd say 'wrong', but I'm too polite).  It's not hope, it's an acknowledgement of the facts, and where they lead.


Ben Zaiboc



