[ExI] Automated black-box-based system design of unsupervised hyperintelligent learning systems
jonkc at bellsouth.net
Fri Sep 23 04:57:43 UTC 2011
On Wed, 9/21/11, Anders Sandberg <anders at aleph.se> wrote:
"Loose argument: there are 1e15 synapses in the brain, we need ~36 bits per synapse to encode where they connect. Plus a few bits of synaptic strength etc. So the information needed to describe a brain is of the order of 4e16 bits. "
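Anders's loose argument is just arithmetic, and can be checked back-of-the-envelope. A minimal sketch, assuming ~1e11 neurons (so addressing a synapse's target takes about log2(1e11) ≈ 36.5 bits) and an assumed 4 bits for "synaptic strength etc.":

```python
import math

# Order-of-magnitude parameters from the post
num_neurons = 1e11      # ~10^11 neurons in a human brain
num_synapses = 1e15     # ~10^15 synapses

# "a few bits of synaptic strength etc." -- 4 is an assumption
strength_bits = 4

# Picking one target neuron out of 1e11 needs log2(1e11) ~ 36.5 bits
address_bits = math.log2(num_neurons)

total_bits = num_synapses * (address_bits + strength_bits)
print(f"address bits per synapse: {address_bits:.1f}")
print(f"total description length: {total_bits:.1e} bits")
```

This lands on the order of 4e16 bits, matching the figure quoted above.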
But you'd almost certainly need a lot less than that to make a human-like mind. Take a look at the January 28, 1994 issue of Science: Dan Madison and Erin Schuman found that Long Term Potentiation spreads out over a large area, so you have lots of copies of the same information. A single synapse therefore can't be the equivalent of one bit of information; instead, a bunch of potentiated synapses work together to store each bit.
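The redundancy argument is quantitative: if each bit of potentiation is shared across a neighborhood of synapses rather than stored in one, the effective information content is the raw estimate divided by the redundancy factor. A tiny sketch, where the redundancy factor of 100 is purely a hypothetical illustration (the papers cited here don't give a number):

```python
raw_bits = 4e16       # Anders's synapse-level estimate
redundancy = 100      # hypothetical copies per bit of potentiation (assumption)

# Each distinct bit is stored redundantly, so divide it out
effective_bits = raw_bits / redundancy
print(f"effective information: {effective_bits:.0e} bits")
```

Even a modest redundancy factor knocks orders of magnitude off the description length needed for a human-like mind.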
Below is the abstract of Madison and Schuman's paper:
"The long-lasting increase in synaptic strength known as long-term potentiation has been advanced as a potential physiological mechanism for many forms of both developmental and adult neuronal plasticity. In many models of plasticity, intercellular communication has been proposed to account for observations in which simultaneously active neurons are strengthened together. The data presented here indicate that long-term potentiation can be communicated between synapses on neighboring neurons by means of a diffusible messenger. This distributed potentiation provides a mechanism for the cooperative strengthening of proximal synapses and may underlie a variety of plastic processes in the nervous system."
John K Clark