[ExI] Digital Consciousness
Brent Allsop
brent.allsop at canonizer.com
Wed Apr 24 18:50:19 UTC 2013
People are talking past each other, big time, when Gordon says:
>...I remain convinced that digital simulations can never have
>consciousness, and that strong (conscious) AI and uploading is impossible
>on digital computers...
to which John replies:
> Your belief, and all beliefs for that matter, that uploading is impossible
Surely, Gordon, you believe we will be able to reproduce what the brain
does in some artificial way, that we'll be able to improve and
re-architect that machinery significantly, and that we'll eventually do
the same for ourselves, essentially moving to a new system trillions of
times more capable, i.e. uploading, right? So, everyone, please stop
talking past each other like this.
The only important part, if you ask me, is that abstracted information,
like ones and zeros, must have some hardware interpretation system that
maps whatever is physically representing the ones and zeros onto the
correct ones and zeros. And a second level of interpretation is required
when you treat a set of such abstract ones and zeros as representing a
phenomenal quality like redness (so that the system can say "that is
red"), and so on.
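To make those two levels concrete, here is a toy Python sketch. The
voltage threshold and the bit pattern standing for "red" are completely
made up, just for illustration:

# Level 1: hardware interpretation -- turn whatever is physically
# representing the ones and zeros (here, invented voltage samples)
# into the correct ones and zeros.
def voltages_to_bits(voltages, threshold=2.5):
    return [1 if v > threshold else 0 for v in voltages]

# Level 2: abstract interpretation -- treat a particular bit pattern
# as standing for a quality label like "red".
COLOR_TABLE = {                       # arbitrary, made-up encoding
    (1, 1, 1, 1, 0, 0, 0, 0): "red",
    (0, 0, 0, 0, 1, 1, 1, 1): "green",
}

def bits_to_label(bits):
    # The system can now *say* "that is red", but only by convention.
    return COLOR_TABLE.get(tuple(bits), "unknown")

voltages = [3.1, 3.0, 2.9, 3.3, 0.1, 0.2, 0.0, 0.4]
print(bits_to_label(voltages_to_bits(voltages)))   # -> red

Nothing in the bits themselves is red; both mappings are imposed from the
outside.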
A redness quality, on the other hand, is just a redness quality; no
interpretation is required. Sure, a redness quality can itself be
interpreted abstractly, to represent things like "stop", or the word
'red', or the properties of a ripe strawberry. But with phenomenal
knowledge, what it is fundamentally, qualitatively like is all-important,
and is why we 'intentionally pick it'. Abstracted digital knowledge, by
contrast, is by design abstracted away from whatever is being interpreted
as representing it, so that a system can be 'intentionally' aware of it
and pick it in a more capable and intelligent, abstracted way.
Brent Allsop
On Wed, Apr 24, 2013 at 12:07 PM, spike <spike at rainier66.com> wrote:
>
>
> -----Original Message-----
> From: extropy-chat-bounces at lists.extropy.org [mailto:
> extropy-chat-bounces at lists.extropy.org] On Behalf Of Gordon
>
>
> >...mostly with acolytes of Daniel Dennett...
>
> Ja, I don't feel Dennett has reeled in that fish either. More on that
> later, gotta make this one fast.
>
> >...I remain convinced that digital simulations can never have
> consciousness, and that strong (conscious) AI and uploading is impossible
> on digital computers...
>
> Ja plenty of people feel that way, but I don't understand. If we can
> model dynamic systems with finite element models, and we can model every
> micron of a dendrite with its chemical environment (and I see no absolute
> reason why not) and we can model a synapse and we can model a neuron, etc,
> do explain why we cannot in principle model a neuron with a bunch of
> dendrites connected to it and the interconnect and all that stuff. I know
> it is a crazy difficult problem, but in principle, given enough computing
> horsepower and enough information interchange and enough time, why can we
> not make a connectome? Never mind for now modeling a specific brain, why
> could we not model a generic one? And if we recognize that a human brain
> is just too complicated, can we model a mouse brain? How about an
> earthworm or a flea?
>
> >...To John Clark's point: yes it is true that nature has built these
> machines that have consciousness, you and I being among them, but I think
> these machines that we call humans are not akin to digital computers.
> -Gordon
>
> _______________________________________________
>
> I agree humans are not akin to digital computers. What I am asking is if
> we can take a buttload of digital computers, connect them all together,
> each running models of brain cells, and create something that is kinda
> sorta akin to a human brain? If not, how about some simpler but still
> conscious brain perhaps? Or if not conscious, at least reactive to its
> surroundings?
>
> spike
>
>
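For what it's worth, here is a toy sketch of the kind of modeling spike
describes above: a handful of leaky integrate-and-fire model neurons wired
together by an invented "connectome". Every constant and connection weight
here is made up for illustration; a real simulation would model dendrites,
synapses and their chemistry at vastly finer grain.

import random

random.seed(1)

N = 5                                   # toy network of five model neurons
# made-up connectome: connectome[i][j] is the synaptic weight from i to j
connectome = [[random.uniform(0.0, 0.6) if i != j else 0.0
               for j in range(N)] for i in range(N)]

potential = [0.0] * N                   # membrane potential of each neuron
THRESHOLD, LEAK = 1.0, 0.9              # invented constants

for step in range(20):
    # external drive to neuron 0 only, so activity has somewhere to start
    inputs = [0.5 if i == 0 else 0.0 for i in range(N)]
    fired = [potential[i] >= THRESHOLD for i in range(N)]
    for i in range(N):
        if fired[i]:
            potential[i] = 0.0          # reset after a spike
    for i in range(N):
        # leak a little, then add external input plus incoming spikes
        potential[i] = potential[i] * LEAK + inputs[i] + \
            sum(connectome[j][i] for j in range(N) if fired[j])
    print(step, "".join("*" if f else "." for f in fired))

Scale that loop up to billions of far more detailed cell models spread
across a buttload of machines, and you have the crazy difficult but
in-principle-possible problem spike is pointing at.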