[ExI] Digital Consciousness

John Clark johnkclark at gmail.com
Thu Apr 25 18:50:51 UTC 2013


On Wed, Apr 24, 2013  Gordon <gts_2000 at yahoo.com> wrote:

> Intentionality, as I and most philosophers use the word, is in basic
> terms the ability to have mental content
>

Then we know that other people must have intentionality because without
mental representations they couldn't design a bridge that wouldn't fall
down, and in the same way we know that computers must have internal
representations of external reality or they couldn't solve physical
problems.

> of which one is aware.
>

Mental representations would be unusable if you didn't know you had them,
and a file in a computer would be unusable to the machine if it didn't know
it had it, if it didn't know where in its vast memory banks it was located.

> Strictly speaking, I cannot prove that you have intentionality
>

Yes you can by observing behavior. Internal representations of external
physical systems are necessary to solve physical problems involving those
systems, so if anyone or anything can solve those problems it MUST have
those internal representations AND know where to find them.

> we can also wear sunglasses to change our visual inputs. Are we supposed
> to think sunglasses are conscious? [...] My watch tells me the time
> accurately, but I'm pretty sure it has no idea what the time is.
>

A lot of the arguments that the "electronics can never work as well as
meat" gang offer take this form, X can do Y but "obviously" X isn't
conscious like people. But exactly why is it obvious? Because in the
examples given X can do only one thing. When your friend tells you the time
you're pretty sure he knows what time it is because his behavior is far
more complex than the watch and he can do more than one thing.

> we cannot make identical digital copies of things that are not digital.
>

But we can make copies of things involving matter or energy or spin or
electrical charge and we can make copies of the finite integer number of
neurons in the brain communicating with each other with a finite integer
number of sodium and potassium ions. That doesn't sound like much of a
restriction.

> In principle, I think we can digitally sim a brain, even down to the
> molecular level.
>

Yes, but that would be vast overkill.

> I seriously doubt that my computer is anything more than a blind,
> unconscious machine. It might act intelligently but [...]
>

I don't know where people get the silly idea that intelligence is easier to
produce than consciousness. There is no easier job in the world than being
a consciousness theorist, because your theories don't have to actually do
anything, and there is no harder job than being an intelligence theorist,
because those ideas must do a hell of a lot.

> Someday a digital computer might pass the Turing test,


And after seeing what Watson can do I think that day could come a lot
sooner than a lot of people think.

> and we will have created weak AI. But strong (conscious, intentional) AI
> is a completely different challenge.
>

Completely different?! It is really astonishing that well into the 21st
century there are still people who say they believe in Darwin's Theory of
Evolution, and even claim to understand it, and yet can say something like
that. Evolution produced Gordon Swobe, and Gordon Swobe is conscious; those
two facts are absolutely, positively, 100% logically inconsistent with what
you just said above.

I have been pointing this fact out for well over a decade but it seldom has
much of an effect; it should bring about a profound change in somebody's
world-view but it never does. People either give an embarrassingly anemic
retort or just shrug it off and continue to firmly believe two
contradictory things: that Evolution is true and that consciousness and
intelligent behavior are unrelated. Why? I'm not a psychiatrist, but I
think it's because people first decide that a computer can definitely never
be conscious and only then go looking for evidence to support their
prejudice. And a belief that is not based on logic cannot be destroyed by
it.

  John K Clark

