[ExI] Digital Consciousness
Eugen Leitl
eugen at leitl.org
Mon May 6 12:50:51 UTC 2013
On Sun, May 05, 2013 at 04:58:57PM -0700, Gordon wrote:
> > As I keep saying (and you keep ignoring), something doesn't have to be digital for a digital computer to compute it.
>
> I'm not ignoring you, Ben. I just think that the answer is yes, it is trivially true that there is some description of the brain such that we could do a digital simulation of it. Church's thesis applies to brains; their operations are in principle computable. We can also create digital simulations of rain storms. This too is trivial. Nobody actually gets wet from those digital rain drops.
This is not for Gordon, but for people who get confused about
such things:
Water isn't wet. Wetness is strictly a property of reality
synthesized in the hardware between people's ears, using data
from sensors.
You can simulate the sensation in the absence of any reality behind it
by direct stimulation of the sensors.
In the case of artificial reality, the organism itself is embedded
in the simulation. It will experience wetness there, and an appropriately
detailed simulation of a nuclear fireball will result in nice
skin burns and the destruction of unreal estate (fortunately, easily
restorable from a snapshot).
Apropos of virtual reality:
http://news.sciencemag.org/sciencenow/2013/05/living-in-the-matrix-requires-le.html?ref=hp
Living in The Matrix Requires Less Brain Power
by Lizzie Wade on 2 May 2013, 3:30 PM
[Image caption: Where am I? In virtual reality, rats use half as many neurons to navigate their environment. Credit: UCLA Neurology]
If you were a rat living in a completely virtual world like in the movie The Matrix, could you tell? Maybe not, but scientists studying your brain might be able to. Today, researchers report that certain cells in rat brains work differently when the animals are in virtual reality than when they are in the real world.
The neurons in question are known as place cells, which fire in response to specific physical locations in the outside world and reside in the hippocampus, the part of the brain responsible for spatial navigation and memory. As you walk out of your house every day, the same place cell fires each time you reach the shrub that's two steps away from your door. It fires again when you reach the same place on your way back home, even though you are traveling in the opposite direction. Scientists have long suspected that these place cells help the brain generate a map of the world around us. But how do the place cells know when to fire in the first place?
Previous research showed that the cells rely on three different kinds of information. First, they analyze "visual cues," or what you see when you look around. Then, there are what researchers call "self-motion cues." These cues come from how your body moves in space and are the reason you can still find your way around a room with the lights out. The final type of information is the "proximal cues," which encompass everything else about the environment you're in. The smell of a bakery on your way to work, the sounds of a street jammed with traffic, and the springy texture of grass in a park are all proximal cues.
In the real world, it's extremely difficult to tease out the influence of each kind of cue. But in a virtual reality environment, scientists are able to control the kinds of information available. In this latest experiment, rats anchored to the top of a ball ran in place as movielike images around them changed, creating the impression that they were running along a track. Their sense of place relied on visual cues from the projections and their self-motion cues, but they had to do without proximal cues like sound and smell.
When Mayank Mehta, a neurophysicist at the University of California (UC), Los Angeles, compared the activity of place cells in rats running along a real, linear track with place cell activity in the rats running in virtual reality, he saw some surprising differences. In the real world, about 45% of the rats' place cells fired at some point along the track. In virtual reality, only 22% did. "Half of the neurons just shut up," he says.
What's more, the place cells seemed to have a very different relationship to space in virtual reality than in the real world. Remember that place cell that fires when you've taken two steps away from your door on your way out of your house? On a real track, the rat's version of that neuron would fire when it had taken two steps away from the start, and then again when the animal reached the same spot on its return trip. But in virtual reality, something odd happened. Rather than firing a second time when the rat reached the same place on its return trip, the cells fired when the rat was two steps away from the opposite end of the track, the authors report online today in Science. That's like the same place cell in your brain firing when you've taken two steps away from your door and then when you've taken two steps away from your car. Instead of encoding a position in absolute space, the place cell seems to be keeping track of the rat's relative distance along the (virtual) track. Mehta calls this the "disto-code" and says, "This never happens in the real world."
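The contrast between the two codes can be made concrete with a toy model. This is a hypothetical sketch, not the paper's analysis: it assumes a track of 11 discrete positions and a single cell tuned either to an absolute position (place code) or to a distance travelled from the start of the current run (disto-code).

```python
# Toy contrast between an absolute place code and the relative "disto-code"
# described above. All names and the discretized track are illustrative
# assumptions, not taken from the Science paper.

TRACK_LEN = 10  # positions 0..10 along the (virtual) track

def place_code_fires(cell_pos, rat_pos):
    """Absolute code: the cell fires whenever the rat occupies cell_pos,
    regardless of which direction it is travelling."""
    return rat_pos == cell_pos

def disto_code_fires(cell_dist, rat_pos, direction):
    """Relative code: the cell fires at a fixed distance travelled from the
    start of the current run, so its firing position flips with direction."""
    travelled = rat_pos if direction == "out" else TRACK_LEN - rat_pos
    return travelled == cell_dist

# A cell tuned to position 2 (place code) vs. distance 2 (disto-code):
place_out = [p for p in range(TRACK_LEN + 1) if place_code_fires(2, p)]
place_back = [p for p in range(TRACK_LEN + 1) if place_code_fires(2, p)]
print(place_out, place_back)   # fires at position 2 on both legs

disto_out = [p for p in range(TRACK_LEN + 1) if disto_code_fires(2, p, "out")]
disto_back = [p for p in range(TRACK_LEN + 1) if disto_code_fires(2, p, "back")]
print(disto_out, disto_back)   # fires at position 2 outbound, position 8 inbound
```

The disto-code cell fires two steps from the start of each run, which on the return trip is two steps from the far end of the track — the "two steps from your door, then two steps from your car" pattern described above.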
Mehta suspects that these differences in place cell activity are related to virtual reality's lack of proximal cues. Perhaps, he posits, the neurons that shut off in virtual reality are the ones responsible for taking smells, sounds, and textures and turning them into information about where the rat is in space. And considering that when those cues disappear, the rat's cognitive map appears to change from one based on absolute space to one based on relative distance, proximal cues might be the key component to how those mental maps work in the real world. "As soon as proximal cues are present, they have veto power," Mehta explains. "They don't let the disto-code come about."
Loren Frank, a UC San Francisco neuroscientist who wasn't involved in the research, is impressed with Mehta's experiment and what it implies about the flexibility of the hippocampus's mapping system. But he cautions that the influence of proximal cues versus visual cues may be very different in rats and humans. "We have a tendency to assume that other organisms process the world in the same way that we do," he says. But unlike humans, "rats don't see terribly well." Instead, they rely heavily on smell and touch. So taking away proximal cues might affect them more dramatically than it would humans.
Daniel Dombeck, a neurobiologist at Northwestern University in Evanston, Illinois, who was not involved with the research, agrees that the new study is "suggestive" that the lack of proximal cues is responsible for many of the differences in rats' brains in virtual reality. "But I do think there's going to be debate about that. Future work is going to have to pin down exactly what the differences are [between virtual reality and the real world]." As a fellow virtual reality researcher (he designs Matrix-like worlds for mice), Dombeck is particularly excited about what Mehta's work shows about how to improve such simulations. "It's a really welcome addition to the growing field of rodent virtual reality."