How Does The Brain Hold So Many Memories?
SciMed - Neuroscience
TS-Si News Service
Friday, 14 September 2007 19:05
Perhaps multiple, weakly coupled networks could do the job
Think of everything we remember, starting with a vocabulary of tens of thousands of words. And then there are all the little details that stretch back decades: the house we grew up in, the time we spilled orange juice on a test back in third grade, and, for some of us, the solution to a quadratic equation.
So where do we put it all? If we had hard drives in our heads, the answer would be easy: we would store memories as 0s and 1s. But we don't; we have neurons connected by synapses, and storing memories in such a system is a lot harder than writing 0s and 1s to a hard drive.
Nevertheless, about two decades ago John Hopfield showed that memories could be stored by modifying the strength of synapses in a particular way. Importantly, the number of memories that could be stored using his scheme was proportional to the number of neurons in the network. This solved the storage problem: there are about 50 million neurons in a cubic centimeter of cortex, plenty of room for both a vocabulary and spilled orange juice.
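Hopfield's scheme can be sketched in a few lines of code. In the sketch below (a toy illustration with made-up sizes, not the brain's actual numbers), memories are patterns of +1s and -1s, synaptic strengths are set by the standard Hebbian outer-product rule, and a corrupted cue is cleaned up by letting the network dynamics run:

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_memories = 200, 5            # toy sizes, well below capacity

# Each memory is a random pattern of +/-1 activity across the neurons.
patterns = rng.choice([-1, 1], size=(n_memories, n_neurons))

# Hebbian storage rule: W_ij = (1/N) * sum over memories of xi_i * xi_j,
# with no neuron connected to itself.
W = patterns.T @ patterns / n_neurons
np.fill_diagonal(W, 0)

def recall(cue, sweeps=10):
    """Asynchronous updates: each neuron aligns with its summed input."""
    state = cue.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n_neurons):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt 15% of one stored pattern, then let the network clean it up.
cue = patterns[0].copy()
flipped = rng.choice(n_neurons, size=30, replace=False)
cue[flipped] *= -1

recovered = recall(cue)
# At this low memory load, the dynamics typically restore the stored pattern.
print(f"overlap with stored memory: {np.mean(recovered == patterns[0]):.2f}")
```

The key point is that the memories live in the weight matrix `W`, not in any single neuron, and each pattern becomes an attractor of the dynamics, which is what makes recall from a partial or noisy cue possible.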
In their new study, however, Yasser Roudi and Peter Latham provided evidence that this scaling breaks down in more realistic networks: the number of storable memories is proportional not to the number of neurons but to the number of connections per neuron (about 10,000 in cortex), and the constant of proportionality is small, not more than a few percent. They also eliminated one of theorists' favorite tricks for increasing that constant, reducing the number of neurons involved in any one memory. Thus, if networks use the algorithm proposed by Hopfield, they can store at most about 500 memories, no matter how many neurons they contain.
So we're not exactly back to square one, but we're not much farther than square two: we no longer know how the brain holds so many memories. Roudi and Latham speculate that the answer lies in multiple, weakly coupled networks.
However, until that, or some other idea, is shown to be correct, we will have to be content with just remembering, without the added knowledge of how we remember.
This work was supported by the Gatsby Charitable Foundation and by the US National Institute of Mental Health.
PLoS Computational Biology is the official journal of the International Society for Computational Biology (ISCB). It is an open access, peer-reviewed journal published monthly by the Public Library of Science (PLoS).
Roudi Y, Latham PE (2007). A Balanced Memory Network. PLoS Comput Biol 3(9): e141. doi:10.1371/journal.pcbi.0030141.
Author Summary. A critical component of cognition is memory: the ability to store information and to readily retrieve it on cue. Existing models postulate that recalled items are represented by self-sustained activity; that is, they are represented by activity that can exist in the absence of input. These models, however, are incomplete, in the sense that they do not explain two salient experimentally observed features of persistent activity: low firing rates and high neuronal variability. Here we propose a model that can explain both. The model makes two predictions: changes in synaptic weights during learning should be much smaller than the background weights, and the fraction of neurons selective for a memory should be above some threshold. Experimental confirmation of these predictions would provide strong support for the model, and constitute an important step toward a complete theory of memory storage and retrieval.