- Poster presentation
- Open Access
The effect of Hebbian plasticity on the attractors of a dynamical system
BMC Neuroscience volume 9, Article number: P99 (2008)
A central problem in neuroscience is to bridge local synaptic plasticity and the global behavior of a system. It has been shown that Hebbian learning of the connections in a feedforward network performs PCA on its inputs [1]. In a recurrent Hopfield network with binary units, the Hebbian-learnt patterns form the attractors of the network [2]. Starting from a random recurrent network, Hebbian learning reduces the system's dynamics from chaotic to fixed-point behavior [3].
In this paper, we investigate the effect of Hebbian plasticity on the attractors of a continuous dynamical system. In a Hopfield network with binary units, it can be shown that Hebbian learning of an attractor stabilizes it, deepening the energy landscape and enlarging its basin of attraction. We are interested in how these properties carry over to continuous dynamical systems.
Consider a system of the form

$$\dot{x}_i = -x_i + f_i\Big(g_i \sum_j T_{ij} x_j\Big), \qquad (1)$$

where $x_i$ is a real variable, $f_i$ is a nondecreasing nonlinear function with range $[-1,1]$, and $g_i$ is the gain. $T$ is the synaptic matrix, which is assumed to have been learned from orthogonal binary ($\{1,-1\}$) patterns $\xi^\mu$ by the Hebbian rule

$$T_{ij} = \frac{1}{N} \sum_\mu \xi_i^\mu \xi_j^\mu.$$

As in the continuous Hopfield network [4], the $\xi^\mu$ are no longer attractors unless the gains $g_i$ are large. Assume that the system settles down to an attractor $x^*$ and then undergoes Hebbian plasticity: $T' = T + \varepsilon x^* x^{*T}$, where $\varepsilon > 0$ is the learning rate. We study how the attractor dynamics change following this plasticity.
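The setup above can be sketched numerically. The following is a minimal illustration, assuming $f_i = \tanh$, a uniform gain $g$, and a small network with two stored patterns; the network size, gain, and learning rate are illustrative choices of ours, not values from the abstract.

```python
import numpy as np

N = 8
# Two orthogonal {1,-1} patterns (columns of a Hadamard-type construction).
xi1 = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
xi2 = np.array([1, 1, -1, -1, 1, 1, -1, -1], dtype=float)
assert xi1 @ xi2 == 0  # orthogonality

# Hebbian synaptic matrix: T_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
T = (np.outer(xi1, xi1) + np.outer(xi2, xi2)) / N

g = 4.0  # uniform gain; large enough for attractors near the corners

def simulate(T, x0, dt=0.05, steps=4000):
    """Euler-integrate dx_i/dt = -x_i + tanh(g * (T x)_i)."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + np.tanh(g * (T @ x)))
    return x

# Settle to the attractor in the basin of pattern xi1.
x_star = simulate(T, 0.1 * xi1)

# Hebbian plasticity on the attractor: T' = T + eps * x* x*^T
eps = 0.05
T_new = T + eps * np.outer(x_star, x_star)
```

With these parameters the attractor is proportional to $\xi^1$ but lies strictly inside the hypercube, matching the observation that the patterns themselves are no longer fixed points at finite gain.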
We show that, in system (1), under certain general conditions, Hebbian plasticity moves the attractor toward the corresponding corner of the hypercube. Linear stability analysis around the attractor shows that the maximum eigenvalue becomes more negative with learning, indicating a deeper energy landscape. In this way the system's ability to retrieve the corresponding stored binary pattern improves, although the attractor itself is not stabilized in the way it is in binary Hopfield networks.
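Both claims can be checked numerically under the same illustrative assumptions ($f_i = \tanh$, uniform gain $g$, parameters chosen by us rather than taken from the abstract). For this choice of $f_i$, the Jacobian of system (1) at a fixed point $x^*$ is $J = -I + g\,\mathrm{diag}\!\big(1 - \tanh^2(g\,T x^*)\big)\,T$:

```python
import numpy as np

N = 8
xi1 = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
xi2 = np.array([1, 1, -1, -1, 1, 1, -1, -1], dtype=float)
T = (np.outer(xi1, xi1) + np.outer(xi2, xi2)) / N  # Hebbian matrix
g, eps = 4.0, 0.05  # illustrative gain and learning rate

def fixed_point(T, x0, dt=0.05, steps=4000):
    """Relax dx/dt = -x + tanh(g T x) to its attractor by Euler steps."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + np.tanh(g * (T @ x)))
    return x

def max_eig(T, x):
    """Largest real part of the Jacobian eigenvalues at x."""
    u = np.tanh(g * (T @ x))
    J = -np.eye(len(x)) + g * np.diag(1.0 - u**2) @ T
    return np.linalg.eigvals(J).real.max()

x_star = fixed_point(T, 0.1 * xi1)       # attractor before learning
lam_before = max_eig(T, x_star)

T_new = T + eps * np.outer(x_star, x_star)  # Hebbian plasticity
x_new = fixed_point(T_new, x_star)          # attractor after learning
lam_after = max_eig(T_new, x_new)
```

In this sketch the attractor's overlap with the stored pattern grows after plasticity (it moves toward the corner), and `lam_after` is more negative than `lam_before`, consistent with a deepened landscape.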
References
1. Linsker R: Self-organization in a perceptual network. Computer. 1988, 21: 105-117. 10.1109/2.36.
2. Hopfield J: Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the USA. 1982, 79: 2554-2558. 10.1073/pnas.79.8.2554.
3. Cessac B, Samuelides M: From neuron to neural networks dynamics. The European Physical Journal – Special Topics. 2007, 142 (1): 7-88. 10.1140/epjst/e2007-00058-2.
4. Hopfield J: Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences of the USA. 1984, 81: 3088-3092. 10.1073/pnas.81.10.3088.
Acknowledgements
Supported by EU project FP6-2005-015803 "Daisy", the Hertie Foundation and the Volkswagen Foundation.
Rights and permissions
Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution 2.0 License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Zhu, J. The effect of Hebbian plasticity on the attractors of a dynamical system. BMC Neurosci 9 (Suppl 1), P99 (2008). https://doi.org/10.1186/1471-2202-9-S1-P99
Keywords
- Synaptic Plasticity
- Linear Stability Analysis
- Energy Landscape
- Maximum Eigenvalue
- Feedforward Network