[ExI] Vermis ex machina

Anders Sandberg anders at aleph.se
Wed Mar 4 09:38:18 UTC 2015

John Clark <johnkclark at gmail.com>, 3/3/2015 4:29 PM:

On Tue, Mar 3, 2015  Anders Sandberg <anders at aleph.se> wrote:

>> Most think that Long Term Potentiation (LTP) is the molecular basis of memory, and in the January 28, 1994 issue of Science, Dan Madison and Erin Schuman found that LTP spreads out, by diffusion of Nitric Oxide (NO), over several cell diameters; so you have lots of copies of the same information, and a single synapse can't be the equivalent of one bit of information. Instead, a bunch of potentiated synapses work together to store that one bit.

> How well has that actually held up? There was a lot of interest in it back in the 90s, but I have not seen much mention of it over the past 15 years. There are a few papers talking about lateral LTP, like http://www.ncbi.nlm.nih.gov/pubmed/25260706, but most just talk about NO as relevant locally for LTP.

The Nitric Oxide would only diffuse over a few cell diameters, but each neuron has about 1000 synapses, so that could include a lot of synapses. I'm not sure whether that would be called local or not.

Well, the diffusion distance in neuropil is about a micron unless there are channeling factors (they can boost it to at least 30 microns): http://onlinelibrary.wiley.com/doi/10.1111/j.1460-9568.2008.06285.x/full

This is enough to cover clusters of synapses, but not really an entire cell - after all, neurons have dendrites reaching many microns away, and axons that can literally go anywhere in the CNS. So I doubt it would tie the synapses on a cell together that strongly.
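For scale, a rough back-of-envelope sketch of how many synapses fall inside each diffusion radius. The synapse density figure (~1 per cubic micron of cortical neuropil) is my assumption, not taken from the papers above:

```python
from math import pi

# Assumed density of synapses in cortical neuropil (~1 per cubic micron).
DENSITY = 1.0  # synapses / um^3

def synapses_within(radius_um, density=DENSITY):
    """Expected synapse count inside a sphere of the given NO diffusion radius."""
    volume = (4.0 / 3.0) * pi * radius_um ** 3  # sphere volume in um^3
    return density * volume

plain = synapses_within(1.0)      # ~1 um diffusion distance in plain neuropil
channeled = synapses_within(30.0)  # with channeling factors

print(round(plain), round(channeled))
```

So a bare micron of diffusion reaches only a handful of synapses (a local cluster), while 30 microns reaches on the order of 10^5 - but even that larger sphere contains synapses belonging to many different cells, not all the synapses of any one cell.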

 > you could have one bit per synapse on average but distributed across a few neighbours: their potentiation levels would contain a mixture of several bits, individually retrievable by the right stimulation pattern.

Maybe, but it seems to me that with a system like that you'd have the worst of both worlds. You'd have inefficient and slow storage, because before making a new memory you'd have to make sure it didn't randomly change an existing memory, but you'd have little or none of the sort of redundancy that could easily be used for error correction. But of course, just because it's a crazy primitive design is no guarantee that Evolution didn't decide to do things that way: evolutionary winners don't have to be the best possible, they just have to be better than the competition.
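The superposed-storage idea above can be sketched numerically. This is a toy model, not a claim about the actual biology; the pool size, number of bits, and the random ±1 stimulation codes are all my assumptions. Each "synapse" holds a mixture of every stored bit, and a bit is read back by correlating the pool with that bit's own stimulation pattern:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000  # synapses in a shared pool
K = 50    # bits stored in superposition

# Each bit gets its own random +/-1 stimulation pattern across the pool.
codes = rng.choice([-1, 1], size=(K, N))
bits = rng.choice([-1, 1], size=K)

# Every synapse's potentiation level is a mixture of all K bits.
weights = codes.T @ bits / N

# Read each bit back by correlating the pool with its pattern.
readout = np.sign(codes @ weights)
accuracy = float(np.mean(readout == bits))
print(accuracy)
```

With K much smaller than N the crosstalk between patterns averages out and recall is essentially perfect; as K approaches N the interference swamps the signal, which is exactly the "new memory randomly changes an existing memory" worry.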

Actually, trying to implement error correction in ways beyond redundancy is rare in evolved systems - such solutions tend to be brittle.

NO dynamics is messy, as the papers above show. No reason to think it is doing anything particularly elegant.

Anders Sandberg, Future of Humanity Institute, Philosophy Faculty, Oxford University
