- Poster presentation
Should Hebbian learning be selective for negative excess kurtosis?
BMC Neuroscience volume 16, Article number: P65 (2015)
Within the Hebbian learning paradigm, synaptic plasticity results in potentiation whenever pre- and postsynaptic activities are correlated, and in depression otherwise. This requirement is, however, not sufficient to determine the precise functional form of Hebbian learning, and a range of distinct formulations have been proposed to date. They differ, in particular, in how runaway synaptic growth is avoided: by imposing a hard upper bound on synaptic strength, by overall synaptic scaling, or by an additive synaptic decay term [1]. Here we propose [2] a multiplicative Hebbian learning rule that is at the same time self-limiting and selective for negative excess kurtosis (for the case of symmetric input distributions).
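As a concrete illustration of what "multiplicative and self-limiting" means, the following minimal Python sketch gates the Hebbian term y*x by a factor (1 - y^2) that vanishes when the bounded postsynaptic activity saturates, so weight growth stalls without a hard bound or an explicit decay term. The gating factor and all constants are illustrative assumptions, not the functional form derived in [2].

import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=10)      # synaptic weights
eta = 0.01                              # learning rate (illustrative)

for _ in range(5000):
    x = rng.normal(size=10)             # presynaptic activity (toy input)
    y = np.tanh(w @ x)                  # bounded postsynaptic activity
    # Hebbian term y*x, gated multiplicatively by (1 - y**2):
    # the update vanishes as y saturates, limiting weight growth.
    w += eta * y * (1.0 - y**2) * x

print("final |w| =", np.linalg.norm(w)) # remains finite: no runaway growth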
Hebbian learning naturally performs a principal component analysis (PCA) whenever a leading principal component is present. Alternative formulations of the Hebbian learning paradigm differ, however, in other properties. Importantly, they may or may not perform an independent component analysis (ICA) whenever one is feasible. An ICA may be achieved by maximizing the excess kurtosis when it is positive, and by minimizing it when it is negative [3]. Noting that the lifetime firing-rate distributions of individual cells are, however, characterized by both a large kurtosis and a large skewness [4], we investigate in this paper the effect of skewness and kurtosis on performing both PCA and ICA (see Figure 1) with several Hebbian learning rules. A particular emphasis is placed on the differences between additive and multiplicative schemes. We find that multiplicative Hebbian plasticity rules select both for small excess kurtosis and for large skewness, allowing them to perform an ICA, in contrast to additive rules.
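The kurtosis-based route to ICA mentioned above can be demonstrated directly. In the Python sketch below, a sub-Gaussian (uniform, excess kurtosis -1.2) and a super-Gaussian (Laplacian, excess kurtosis +3) source are mixed and whitened; scanning unit projections then shows that the maximum and the minimum of the excess kurtosis pick out the two independent components, in line with [3]. The sources, mixing matrix, and sample size are illustrative choices, not taken from the poster.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
s = np.vstack([rng.uniform(-1, 1, n),     # sub-Gaussian source (excess kurtosis < 0)
               rng.laplace(0, 1, n)])     # super-Gaussian source (excess kurtosis > 0)
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                # arbitrary mixing matrix
x = A @ s                                 # observed mixtures

# Whiten the mixtures: zero mean, unit covariance (ZCA whitening).
x -= x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x

def excess_kurtosis(u):
    # u is a whitened, unit-variance projection, so E[u^4] - 3 suffices.
    return np.mean(u**4) - 3.0

# Scan unit projections w(theta); the extrema of the excess kurtosis
# mark the directions of the independent components.
thetas = np.linspace(0.0, np.pi, 180)
ks = [excess_kurtosis(np.array([np.cos(t), np.sin(t)]) @ z) for t in thetas]
print("kurtosis maximized at theta =", thetas[int(np.argmax(ks))])
print("kurtosis minimized at theta =", thetas[int(np.argmin(ks))])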
References
Abbott LF, Nelson SB: Synaptic plasticity: taming the beast. Nat Neurosci. 2000, 3: 1178-1183.
Echeveste R, Gros C: Generating functionals for computational intelligence: the Fisher information as an objective function for self-limiting Hebbian learning rules. Front Robot AI. 2014, 1: 1.
Hyvärinen A, Oja E: Independent component analysis: algorithms and applications. Neural Netw. 2000, 13: 411-430.
Willmore B, Tolhurst DJ: Characterizing the sparseness of neural codes. Netw Comput Neural Syst. 2001, 12: 255-270.
Acknowledgements
The support of the German Science Foundation (DFG) and the German Academic Exchange Service (DAAD) is acknowledged.
Rights and permissions
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
Cite this article
Gros, C., Eckmann, S. & Echeveste, R. Should Hebbian learning be selective for negative excess kurtosis? BMC Neurosci 16 (Suppl 1), P65 (2015). https://doi.org/10.1186/1471-2202-16-S1-P65
DOI: https://doi.org/10.1186/1471-2202-16-S1-P65