
A model for structural plasticity in neocortical associative networks trained by the hippocampus

The hippocampal formation plays a crucial role in organizing cortical long-term memory. It is believed that the hippocampus is capable of fast (one-shot) learning of new episodic information followed by extensive time periods where corresponding neocortical representations are trained and "compressed" [1]. Here, compression usually refers to processes such as chunking spatially and temporally distributed activity patterns. We take the complementary approach and optimize the synaptic network by structural plasticity, e.g., replacing unused synapses, thereby making full use of the potential connectivity [2].

We apply the frameworks of structural plasticity and hippocampus-induced learning to the training of neocortical associative networks [3]. Associative networks such as the Hopfield or Willshaw model are at the heart of many cortex theories and have long been analyzed with respect to information storage capacity and plausible retrieval strategies [3, 4]. For example, it is well known that a completely connected network can store about 0.7 bits per synapse. For incompletely connected networks, however, the capacity per synapse can be massively reduced or even vanish, depending on the retrieval algorithm [4].
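To make the Willshaw model mentioned above concrete, the following sketch stores sparse binary patterns by clipped Hebbian learning and retrieves one from a partial cue. This is a minimal NumPy illustration; the network size, pattern sparsity, and threshold rule are illustrative choices, not parameters from this work.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, M = 1000, 10, 50   # neurons, active units per pattern, stored patterns

# Generate M sparse binary patterns with exactly k active units each.
patterns = np.zeros((M, n), dtype=bool)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = True

# Willshaw (clipped Hebbian) learning: a synapse is set to 1 if any
# stored pattern coactivated its pre- and postsynaptic neuron.
W = np.zeros((n, n), dtype=bool)
for p in patterns:           # auto-association: each pattern with itself
    W |= np.outer(p, p)

# Retrieval from a degraded cue: delete half of the active units,
# then threshold the dendritic potentials at the cue activity.
cue = patterns[0].copy()
active = np.flatnonzero(cue)
cue[active[: k // 2]] = False
potentials = W[cue].sum(axis=0)   # summed input via set synapses
out = potentials >= cue.sum()     # Willshaw threshold
```

At this loading the memory matrix is still sparse, so the surviving half of the cue suffices to recover the full pattern without spurious units.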

In this work we analyze how structural processes and synaptic consolidation [5] during hippocampal training can improve the performance of neocortical associative networks by emulating full (or increased) synaptic connectivity. In our model the hippocampus can store a set of activity patterns by one-shot learning. The hippocampus then trains the neocortex by repeatedly replaying the patterns in a sequence. Synapses of the neocortical network are consolidated according to a Hebbian criterion. In each time step, a fraction of the unconsolidated synapses is removed and replaced by the same number of new synapses at random locations, thereby maintaining total connectivity. We show that this procedure can massively increase the synaptic capacity of a cortical macrocolumn (factor 10–20, or even up to factor 200 for pattern capacity). In a second step we analyze the model with respect to the time (or number of repetitions) necessary to increase effective connectivity from base level to a desired level. The analysis shows that acceptable training time requires a certain fraction of unconsolidated synapses to keep the network plastic.
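The consolidation-and-replacement procedure described above can be sketched as follows. This is a simplified toy model: a Boolean "required synapse" matrix derived from the replayed patterns stands in for Hebbian consolidation, and the parameter values and replacement rule are illustrative assumptions rather than the actual model analyzed here.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, M = 400, 8, 30   # neurons, active units per pattern, stored patterns
P = 0.1                # anatomical connectivity (fraction of pairs connected)
f = 0.3                # fraction of unconsolidated synapses replaced per step

# Sparse patterns repeatedly replayed by the "hippocampus".
patterns = np.zeros((M, n), dtype=bool)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = True

# Synapses that would experience Hebbian coincidences during replay.
required = np.zeros((n, n), dtype=bool)
for p in patterns:
    required |= np.outer(p, p)

# Random initial connectivity: Boolean mask of realized synapses.
S = rng.random((n, n)) < P
num_syn = S.sum()

for step in range(30):
    unconsol = S & ~required           # never strengthened during replay
    # Remove a fraction f of the unconsolidated synapses ...
    idx = np.flatnonzero(unconsol.ravel())
    kill = rng.choice(idx, size=int(f * idx.size), replace=False)
    S.ravel()[kill] = False
    # ... and regrow the same number at random empty locations,
    # keeping total connectivity constant.
    empty = np.flatnonzero(~S.ravel())
    grow = rng.choice(empty, size=kill.size, replace=False)
    S.ravel()[grow] = True

# Fraction of "required" synapses that are now anatomically realized.
effective = (S & required).sum() / required.sum()
```

Because consolidated synapses are never removed while replaced synapses keep landing at random, the effective connectivity among the pattern-relevant pairs climbs well above the anatomical base level P over repeated replay steps.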

References

  1. Buzsaki G: The hippocampo-neocortical dialogue. Cerebral Cortex. 1996, 6: 81-92. 10.1093/cercor/6.2.81.

  2. Stepanyants A, Chklovskii DB: Geometry and structural plasticity of synaptic connectivity. Neuron. 2002, 34: 275-288. 10.1016/S0896-6273(02)00652-9.

  3. Knoblauch A: On compressing the memory structures of binary neural associative networks. Technical Report HRI-EU 06-02. 2006, Honda Research Institute Europe, Offenbach, Germany.

  4. Graham B, Willshaw D: Improving recall from an associative memory. Biological Cybernetics. 1995, 72: 337-346.

  5. Fusi S, Drew PJ, Abbott LF: Cascade models of synaptically stored memories. Neuron. 2005, 45: 599-611. 10.1016/j.neuron.2005.02.001.


Author information

Correspondence to Andreas Knoblauch.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License 2.0 (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Keywords

  • Retrieval Algorithm
  • Retrieval Strategy
  • Structural Plasticity
  • Extensive Time Period
  • Hebbian Learning