Volume 9 Supplement 1

## Seventeenth Annual Computational Neuroscience Meeting: CNS*2008

# The effect of Hebbian plasticity on the attractors of a dynamical system

- Junmei Zhu

**9(Suppl 1)**:P99

**DOI:** 10.1186/1471-2202-9-S1-P99

© Zhu; licensee BioMed Central Ltd. 2008

**Published:** 11 July 2008

A central problem in neuroscience is to bridge local synaptic plasticity and the global behavior of a system. It has been shown that Hebbian learning of connections in a feedforward network performs PCA on its inputs [1]. In a recurrent Hopfield network with binary units, the Hebbian-learnt patterns form the attractors of the network [2]. Starting from a random recurrent network, Hebbian learning reduces the system's complexity from chaotic dynamics to a fixed point [3].

In this paper, we investigate the effect of Hebbian plasticity on the attractors of a continuous dynamical system. In a Hopfield network with binary units, it can be shown that Hebbian learning of an attractor stabilizes it, deepening the energy landscape and enlarging its basin of attraction. We are interested in how these properties carry over to continuous dynamical systems.
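In the binary case the deepening can be made explicit. Writing the Hopfield energy as $E(s)=-\frac{1}{2}s^{T}Ts$, one Hebbian update of a stored pattern $\xi \in \{-1,1\}^N$, $T' = T + \varepsilon\,\xi\xi^{T}$, lowers the energy at $\xi$ by a fixed amount:

$$E'(\xi) = -\tfrac{1}{2}\,\xi^{T}\big(T + \varepsilon\,\xi\xi^{T}\big)\,\xi = E(\xi) - \tfrac{\varepsilon}{2}\big(\xi^{T}\xi\big)^{2} = E(\xi) - \tfrac{\varepsilon}{2}N^{2},$$

while the energy of any other state $s$ with overlap $s^{T}\xi = q$ drops only by $\varepsilon q^{2}/2 < \varepsilon N^{2}/2$, so the minimum at $\xi$ deepens relative to its surroundings.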

The system we consider is

$$\frac{dx_i}{dt} = -x_i + f_i\Big(g_i \sum_j T_{ij} x_j\Big), \qquad (1)$$

where *x*_{i} is a real variable and *f*_{i} is a nondecreasing nonlinear function with range [-1, 1]. *T* is the synaptic matrix, which is assumed to have been learned from orthogonal binary ({1, -1}) patterns ξ^{μ} by the Hebbian rule $T = {\displaystyle \sum_{\mu} \xi^{\mu} (\xi^{\mu})^{T}}$. As in the continuous Hopfield network [4], the ξ^{μ} are no longer attractors unless the gains *g*_{i} are large. Assume that the system settles down to an attractor *X** and then undergoes Hebbian plasticity, *T'* = *T* + *εX***X**^{T}, where *ε* > 0 is the learning rate. We study how the attractor dynamics change following this plasticity.
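This setup can be explored numerically. The sketch below assumes dynamics of the form dx/dt = -x + tanh(g·Tx) with a uniform gain, two orthogonal stored patterns, and illustrative parameter values (N, g, ε, integration step); none of these choices come from the paper itself.

```python
# Minimal numerical sketch (assumed model, illustrative parameters):
# continuous network dx/dt = -x + tanh(g * T @ x), with T built by the
# Hebbian rule from two orthogonal binary patterns.
import numpy as np

N, g, eps = 8, 0.25, 0.5
xi1 = np.ones(N)
xi2 = np.array([1., 1., 1., 1., -1., -1., -1., -1.])
T = np.outer(xi1, xi1) + np.outer(xi2, xi2)   # Hebbian synaptic matrix

def attractor(W, x0, dt=0.05, steps=4000):
    """Euler-integrate dx/dt = -x + tanh(g * W @ x) to a fixed point."""
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (-x + np.tanh(g * (W @ x)))
    return x

x_star = attractor(T, 0.5 * xi1)              # attractor near pattern xi1
T_new = T + eps * np.outer(x_star, x_star)    # Hebbian plasticity step
x_new = attractor(T_new, x_star)              # attractor after learning

# After plasticity the attractor lies closer to the hypercube corner xi1.
print(np.linalg.norm(x_star - xi1), np.linalg.norm(x_new - xi1))
```

With a moderate gain the attractor sits strictly inside the hypercube; the plasticity step pulls it outward toward the corresponding corner.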

We show that, for system (1) under fairly general conditions, Hebbian plasticity moves the attractor towards its corner of the hypercube. Linear stability analysis around the attractor shows that the maximum eigenvalue becomes more negative with learning, indicating a deeper landscape. In this way plasticity improves the system's ability to retrieve the corresponding stored binary pattern, even though the attractor itself is not stabilized in the way it is in binary Hopfield networks.
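The eigenvalue statement can also be checked numerically. The sketch below again assumes tanh dynamics with a uniform gain and illustrative parameters; the Jacobian of dx/dt = -x + tanh(g·Wx) at a fixed point x is J = -I + diag(f′)·gW.

```python
# Linear-stability sketch (assumed model, illustrative parameters):
# compare the largest Jacobian eigenvalue before and after one
# Hebbian plasticity step T' = T + eps * x* x*^T.
import numpy as np

N, g, eps = 8, 0.25, 0.5
xi1 = np.ones(N)
xi2 = np.array([1., 1., 1., 1., -1., -1., -1., -1.])
T = np.outer(xi1, xi1) + np.outer(xi2, xi2)

def attractor(W, x0, dt=0.05, steps=4000):
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (-x + np.tanh(g * (W @ x)))
    return x

def max_eig(W, x):
    """Largest eigenvalue of J = -I + diag(f') * (g W) at the point x."""
    fp = 1.0 - np.tanh(g * (W @ x)) ** 2          # f' for f = tanh
    J = -np.eye(len(x)) + np.diag(fp) @ (g * W)
    return np.linalg.eigvals(J).real.max()

x_star = attractor(T, 0.5 * xi1)
lam_before = max_eig(T, x_star)
T_new = T + eps * np.outer(x_star, x_star)
x_new = attractor(T_new, x_star)
lam_after = max_eig(T_new, x_new)

print(lam_before, lam_after)   # both negative; more negative after learning
```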

## Declarations

### Acknowledgements

Supported by EU project FP6-2005-015803 "Daisy", the Hertie Foundation and the Volkswagen Foundation.

## References

- Linsker R: Self-organization in a perceptual network. Computer. 1988, 21: 105-117. 10.1109/2.36.
- Hopfield J: Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the USA. 1982, 79: 2554-2558. 10.1073/pnas.79.8.2554.
- Cessac B, Samuelides M: From neuron to neural networks dynamics. The European Physical Journal – Special Topics. 2007, 142 (1): 7-88. 10.1140/epjst/e2007-00058-2.
- Hopfield J: Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences of the USA. 1984, 81: 3088-3092. 10.1073/pnas.81.10.3088.

## Copyright

This article is published under license to BioMed Central Ltd.