  • Poster presentation
  • Open access

Self-organization of asymmetric associative networks


Associative networks serve as basic paradigms for memory retrieval [1, 3] and have been used to describe properties of many neuronal structures [4, 5]. In contrast, learning via activity-dependent adaptation of real synapses is still not well understood. In particular, the Hopfield model [2] artificially enforces synapses to represent neuronal correlations according to the Hebbian rule, which unrealistically leads to symmetric couplings, to catastrophic forgetting when large numbers of patterns are to be stored, and to an explosion of weights when the Hebbian rule is applied iteratively. Various modifications of the simple Hebbian rule have been proposed, for instance the use of global information to control Hebbian contributions to the weight matrix [6], which prevents catastrophic forgetting and weight explosion. In contrast, we present a local learning rule that, in large networks of N neurons, can stabilize more than 1.6 N binary non-sparse random patterns (see Figure 1). When applied in an on-line fashion, it leads to retention of the stack of recent patterns without attrition. The synaptic algorithm turns out to be consistent with spike-timing-dependent plasticity as observed in hippocampus and cortex. In fact, it resembles the perceptron rule [3] in that it modifies a synapse only when the postsynaptic neuron changes its activity. We also find that the mutual interaction of network dynamics with weight changes confines synaptic strengths to finite values despite the formal instability of the local synaptic learning rule. Taken together, our work suggests that stabilization could provide a unifying principle of weight-activity co-evolution, leading to large storage capacity, on-line learning, generalization, and extraction of higher-order correlations, with testable implications for synaptic dynamics and cortical function.
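To illustrate the kind of mechanism described here, the following is a minimal sketch of a perceptron-style local rule [3] in an asymmetric associative network. This is not the authors' exact algorithm; the network size, learning rate, and sweep count are illustrative assumptions. Each row of the weight matrix is trained independently, and a synapse is updated only when its postsynaptic neuron misclassifies the target activity, which generically produces asymmetric couplings:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 130            # N neurons, P > N binary patterns (illustrative sizes)
eta = 0.1                  # learning rate (assumption)
patterns = rng.choice([-1, 1], size=(P, N))
W = np.zeros((N, N))       # no symmetry constraint: W[i, j] != W[j, i] in general

# Perceptron-style local rule: row i of W is updated only when
# postsynaptic neuron i disagrees with its target activity.
for _ in range(1000):                     # training sweeps
    stable = True
    for xi in patterns:
        h = W @ xi                        # local fields
        wrong = np.sign(h) != xi          # neurons whose output disagrees
        if wrong.any():
            stable = False
            # Hebbian-like update restricted to misclassifying rows
            W[wrong] += eta * np.outer(xi[wrong], xi)
            np.fill_diagonal(W, 0.0)      # no self-couplings
    if stable:
        break

# Check: every stored pattern should be a fixed point of the sign dynamics
errors = sum((np.sign(W @ xi) != xi).sum() for xi in patterns)
print("unstable bits:", errors)          # 0 once learning has converged
```

Because each row is an independent perceptron with roughly N inputs, such a rule can in principle stabilize up to about 2N random patterns per neuron, consistent with storing more than N patterns in total; the resulting W is asymmetric because rows are corrected independently.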

Figure 1

Retrieval of stored patterns depending on the relative number P / N of patterns learned in a network of N neurons.


  1. Palm G: Neural Assemblies. An Alternative Approach to Artificial Intelligence. 1982, Springer, Berlin, Heidelberg, New York

  2. Hopfield JJ: Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci USA. 1982, 79: 2554-2558. 10.1073/pnas.79.8.2554.

  3. Hertz JA, Krogh A, Palmer RG: Introduction to the Theory of Neural Computation. 1991, Addison Wesley

  4. Miyashita Y: Inferior temporal cortex: Where visual perception meets memory. Ann Rev Neurosci. 1993, 16: 245-263. 10.1146/

  5. Amit DJ, Brunel N, Tsodyks MV: Correlations of cortical Hebbian reverberations: theory versus experiment. J Neuroscience. 1994, 14: 6435-6445.

  6. Blumenfeld B, Preminger S, Sagi D, Tsodyks M: Dynamics of memory representations in networks with novelty-facilitated synaptic plasticity. Neuron. 2006, 52: 383-394. 10.1016/j.neuron.2006.08.016.



Acknowledgements

Many thanks to Misha Tsodyks for encouraging discussions.

Author information


Corresponding author

Correspondence to Klaus Pawelzik.

Rights and permissions

Open Access: This article is published under license to BioMed Central Ltd. It is distributed under the terms of the Creative Commons Attribution 2.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Albers, C., Pawelzik, K. Self-organization of asymmetric associative networks. BMC Neurosci 10 (Suppl 1), P340 (2009).
