[ExI] Vermis ex machina
Anders Sandberg
anders at aleph.se
Sat Mar 7 09:57:35 UTC 2015
Stuart LaForge <avant at sollegro.com> , 6/3/2015 11:02 PM:
> But this is cell failure. Synapses fail at proper transmission
> *nearly all the time*!
> http://www.pnas.org/content/91/22/10380.full.pdf
> http://zadorlab.cshl.edu/PDF/zador-jn-mi.pdf
> Basically, there is a great deal of noise and variability introduced
> in synaptic transmission. The system is reliable since it uses many
> synapses and neurons, which are individually misbehaving a lot of
> the time.
> Might not this be a feature rather than a bug?
I have sat through more talks arguing this at computational neuroscience conferences than I can count. You can do nifty things using failure-prone parts. But proving that this is a feature, and not just a bug/spandrel that the messy engineer evolution decided to use for other purposes, is hard.
> I mean computers behave very deterministically but people not so much.
Actually, I think people underestimate the indeterminism of real software. Sure, Turing machines and idealized single-threaded processors are deterministic (setting aside undecidability, Rice's theorem, chaotic dynamics, and so on), but in practice a real computer has a surprising number of nondeterministic aspects: what happens when processes A and B try to change the value of a variable at the same time?
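As a minimal sketch (mine, not from the thread), the classic lost-update race behind that question can be made deterministic by replaying two fixed interleavings of the same read-modify-write by hand:

```python
# Hypothetical illustration of a lost-update race: two "processes" each
# read a shared counter, add 1, and write the result back. The final
# value depends entirely on how the reads and writes interleave.

def run(schedule):
    """Replay a fixed interleaving of (process, action) events.
    Each process first reads the shared counter, then writes back
    its private copy plus one."""
    shared = 0
    local = {}
    for proc, action in schedule:
        if action == "read":
            local[proc] = shared
        else:  # "write"
            shared = local[proc] + 1
    return shared

# Serial schedule: A finishes before B starts, so both increments land.
serial = [("A", "read"), ("A", "write"), ("B", "read"), ("B", "write")]

# Interleaved schedule: B reads the stale value, so one increment is lost.
racy = [("A", "read"), ("B", "read"), ("A", "write"), ("B", "write")]

print(run(serial))  # 2
print(run(racy))    # 1
```

On real hardware the scheduler picks the interleaving for you, which is exactly why the outcome is not deterministic from the program text alone.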
One cannot infer macroscale behavior from microscale components in general. The properties of synapses do not shine through to macroscale behavior most of the time, just as the properties of the atoms making up the body do not show up in how we act.
Delaying actions using inefficient transmission sounds like a bad solution, since it is inflexible: if the delay only depends on microscale properties and not on the actual situation/urgency, then you will react slowly when it really matters for survival too. In reality action selection seems to happen because of parallel inputs in the basal ganglia loops, and the overall strength of the input - mainly the number of axons firing - can make the selection faster. You could say noisy, lossy synapses still act as a barrier for weak thoughts turning into actions, but the opposite is also true: inhibiting bad actions is also done using noisy synapses that sometimes fail, and there is a constant barrage of random impulses not due to any proper thought in the first place.
> A possible test of this hypothesis might be to compare the synaptic
> efficiency of different species. My hypothesis would predict that less
> complex organisms should have *more* efficient synapses since their
> mode of dealing with stimulus would be more reflexive than intentional.
You would need to separate out the confounding factor of having fewer synapses to play with. We can make population codes that work fine with very noisy synapses since we have numerous axons, but an insect or C. elegans will have just a few - they would need sharper synapses regardless of intentionality. There are also metabolic issues: http://www.nature.com/neuro/journal/v1/n1/full/nn0598_36.html - having inefficient synapses saves some energy, but there is a balance with information transmission rate (and the optimum will depend on what kind of organism you are).
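A toy calculation (my numbers, not from the thread) shows why numerous axons rescue unreliable synapses: if the downstream readout just needs some fraction of the population to transmit, the binomial tail does the work. The release probability of 0.3 is in the rough range the cited papers discuss; the 0.25 detection threshold is an illustrative assumption.

```python
# Sketch of a population code built from unreliable synapses: each of n
# synapses transmits independently with probability p, and the signal
# "gets through" if at least frac*n of them succeed.
from math import comb, ceil

def p_signal(n, p, frac=0.25):
    """P(at least ceil(frac*n) of n independent synapses transmit),
    with per-synapse release probability p (binomial tail)."""
    k = ceil(frac * n)
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# A single synapse at p = 0.3 fails most of the time, but reliability
# climbs toward certainty as the population grows:
for n in (1, 4, 40, 400):
    print(n, round(p_signal(n, 0.3), 4))
```

The same formula run backwards makes the confound concrete: an organism with only a handful of synapses per connection must push p itself up to get reliable transmission, whatever its behavioural style.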
Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University