[ExI] Do digital computers feel?

John Clark johnkclark at gmail.com
Fri Dec 23 19:21:01 UTC 2016


On Thu, Dec 22, 2016 at 4:57 PM, William Flynn Wallace <foozler83 at gmail.com>
wrote:



> All I am trying to say is that we are talking about the most complex thing
> known to man and reducing it to code.


The brain is complex, but not all that complex. Ray Kurzweil estimates
you'd need about 50 megabytes of code to emulate the behavior of a newborn
baby, and that seems about right to me. The entire human genome contains
only 3 billion base pairs. There are 4 bases, so each base can represent 2
bits, and at 8 bits per byte that comes out to 750 meg. Just 750 meg. About
half of that is for parts of the body other than the brain, so we're down
to 375, and most of that 375 would be for the basic metabolism any cell
needs to stay alive, which has nothing to do with information processing.
And the genome is notorious for encoding information inefficiently, with
long stretches of repeats (instead of saying something like "write ABC 1001
times" it will often actually write "ABC" 1001 times). So 50 meg seems
about right for a seed AI, about the same as it would take to record one
Britney Spears song at good quality.
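
A rough back-of-the-envelope version of that arithmetic, as a Python
sketch (the 50% body share and the repeat example are illustrative
assumptions, not measured figures):

    # Genome size as an upper bound on the information needed
    base_pairs = 3_000_000_000       # human genome: ~3 billion base pairs
    bits_per_base = 2                # 4 possible bases -> 2 bits each
    genome_bytes = base_pairs * bits_per_base // 8
    print(genome_bytes // 10**6)     # 750 megabytes for the whole genome

    brain_bytes = genome_bytes // 2  # assume ~half codes for the brain
    print(brain_bytes // 10**6)      # 375 megabytes

    # Repeats compress well: a run-length pair is far smaller than the
    # spelled-out string, which is the inefficiency described above.
    spelled_out = "ABC" * 1001       # 3003 characters stored literally
    run_length = ("ABC", 1001)       # the same information in a few bytes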


> It just boggles my mind.


Mine too, but that doesn't mean it's untrue.


> Even if you could hook up every neuron, every glial cell for recording
> purposes, and assuming that the hookups did not interfere with the
> functions (which I would very, very seriously doubt)

Now you're talking about uploading, not just AI, but even so it's just a
question of making sure the atoms of the correct element are in the
correct places.

> No, every cell is just atoms and I agree that computers and people are
> alike in that way - no magical something to account for anything including
> consciousness.


I'm glad to hear you say that.


> But by your own logic, you could never tell if a computer program was
> conscious and could feel.

True, but except for yourself you could never tell that *ANYTHING* is
conscious and can feel unless you accept certain axioms.

> By my own logic, all we could do is sample behavior and induce, followed
> by deduction and further testing.

Logic is useless unless there are axioms for that logic to work on. I have
2 axioms:

1) I am conscious.
2) Darwin was right about random mutation and natural selection being the
origin of species.

From those axioms it's easy to use logic and deduce that intelligent
behavior must imply consciousness. Those are my axioms, so what are your
axioms?


> Suppose instead of uploading a real brain, it was built from the get-go
> with code - the way they are doing it now. Now suppose that it passes all
> the Turing tests and whatever. Would such an advanced computer be capable
> of lying? Yes?


Yes, I agree, it could be lying; the AI could just be pretending to be
stupid when it's really smart as hell. Maybe computers already do that and
have been fooling us for years, and maybe the same thing is true for rocks.
It would be much more difficult to do the reverse; I don't quite see how
Einstein could have just been pretending to be smart.

> BTW - connection between intelligence and consciousness. There is no
> evidence that an amoeba has any memory,

Untrue.


https://www.newscientist.com/article/dn15068-smart-amoebas-reveal-origins-of-primitive-intelligence/


John K Clark

