
Should Hebbian learning be selective for negative excess kurtosis?

Within the Hebbian learning paradigm, synaptic plasticity results in potentiation whenever pre- and postsynaptic activities are correlated, and in depression otherwise. This requirement is, however, not sufficient to determine the precise functional form of Hebbian learning, and a range of distinct formulations have been proposed. They differ, in particular, in how runaway synaptic growth is avoided: by imposing a hard upper bound on the synaptic strength, by overall synaptic scaling, or by additive synaptic decay [1]. Here we propose [2] a multiplicative Hebbian learning rule which is, at the same time, self-limiting and selective for negative excess kurtosis (for the case of symmetric input distributions).
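The self-limiting character of a multiplicative scheme can be illustrated with Oja's classic rule, a standard textbook example (not the Fisher-information-derived rule of [2]): the multiplicative term -y^2 w scales the weights down in proportion to their own size, so the weight norm saturates without a hard bound or additive decay. A minimal sketch, with an assumed 2x2 input covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_update(w, x, eta=0.01):
    """One multiplicative, self-limiting Hebbian step (Oja's rule).

    dw = eta * y * (x - y * w): the Hebbian term eta*y*x is balanced
    by the multiplicative decay -eta*y^2*w, which bounds |w| near 1
    without any hard cutoff on the synaptic strength.
    """
    y = w @ x                          # postsynaptic activity (linear neuron)
    return w + eta * y * (x - y * w)

# inputs with a dominant direction: the rule converges to the leading
# principal component while the weight norm stays bounded
C = np.array([[3.0, 1.0], [1.0, 1.0]])   # assumed input covariance
L = np.linalg.cholesky(C)
w = rng.normal(size=2)
for _ in range(20000):
    x = L @ rng.normal(size=2)
    w = oja_update(w, x)

print(np.linalg.norm(w))               # stays close to 1: no runaway growth
```

The same Hebbian drive with the decay term removed would let |w| grow without bound, which is the failure mode the schemes listed above are designed to prevent.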

Hebbian learning results naturally in a principal component analysis (PCA), whenever one is present. Alternative formulations of the Hebbian learning paradigm differ, however, in other properties. Importantly, they may or may not perform an independent component analysis (ICA), whenever one is feasible. The ICA may be achieved by maximizing (minimizing) the excess kurtosis whenever the latter is positive (negative) [3]. Noting that naturally occurring individual-cell lifetime firing rates are, however, characterized by both a large kurtosis and a large skewness [4], we investigate in this paper the effect of skewness and kurtosis on performing both PCA and ICA (see Figure 1) with several Hebbian learning rules. A particular emphasis is placed on the differences between additive and multiplicative schemes. We find that multiplicative Hebbian plasticity rules select both for small excess kurtosis and for large skewness, allowing them to perform an ICA, in contrast to additive rules.
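The two moments in question are the skewness and the excess kurtosis of the standardized output distribution: the excess kurtosis vanishes for a Gaussian, is negative for sub-Gaussian distributions (e.g. uniform) and positive for super-Gaussian ones (e.g. Laplace), while a large positive skewness signals a heavy right tail, as for lifetime firing-rate distributions. A short numerical check of these standard definitions:

```python
import numpy as np

def skewness(x):
    """Third standardized moment; 0 for any symmetric distribution."""
    z = (x - x.mean()) / x.std()
    return np.mean(z**3)

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; 0 for a Gaussian."""
    z = (x - x.mean()) / x.std()
    return np.mean(z**4) - 3.0

rng = np.random.default_rng(1)
n = 200_000
print(excess_kurtosis(rng.uniform(-1, 1, n)))   # ~ -1.2: sub-Gaussian
print(excess_kurtosis(rng.laplace(size=n)))     # ~ +3.0: super-Gaussian
print(skewness(rng.exponential(size=n)))        # ~ +2.0: heavy right tail
```

A kurtosis-based ICA contrast function [3] is maximized or minimized over output directions depending on this sign, which is why selectivity for negative excess kurtosis matters for symmetric, sub-Gaussian sources.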

Figure 1

The learning rule's ability to perform an independent component analysis is tested with the non-linear bars problem. An input set consisting of a random number of horizontal and vertical bars is fed to the neuron, which, after training, becomes selective to individual bars or pixels (the independent components of the input set), even when such an input was never presented to the neuron in isolation.
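Stimuli of the kind described in the caption can be generated along the following lines (the grid size and bar probability here are assumptions for illustration). The mixing is non-linear because overlapping bars do not sum: a pixel covered by two bars simply stays at 1.

```python
import numpy as np

rng = np.random.default_rng(2)

def bars_input(L=5, p=0.2):
    """One non-linear bars stimulus on an L x L grid.

    Each of the 2L possible horizontal and vertical bars is switched
    on independently with probability p; pixels remain binary even
    where bars overlap, making the superposition non-linear.
    """
    img = np.zeros((L, L))
    img[rng.random(L) < p, :] = 1.0    # horizontal bars
    img[:, rng.random(L) < p] = 1.0    # vertical bars
    return img.ravel()

# a training set of flattened stimuli, one row per input pattern
X = np.stack([bars_input() for _ in range(1000)])
print(X.shape)                         # (1000, 25)
```

An ICA-capable learning rule trained on such inputs should develop receptive fields matching single bars, even though isolated single-bar inputs are rare in the training set.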


  1. Abbott LF, Nelson SB: Synaptic plasticity: taming the beast. Nat Neurosci. 2000, 3: 1178-1183.


  2. Echeveste R, Gros C: Generating Functionals for Computational Intelligence: The Fisher Information as an Objective Function for Self-Limiting Hebbian Learning Rules. Front Robot AI. 2014, 1: 1-


  3. Hyvärinen A, Oja E: Independent component analysis: algorithms and applications. Neural Netw. 2000, 13: 411-430.


  4. Willmore B, Tolhurst DJ: Characterizing the sparseness of neural codes. Network-Comp Neural. 2001, 12: 255-270.


Acknowledgements


The support of the German Science Foundation (DFG) and the German Academic Exchange Service (DAAD) is acknowledged.

Author information



Corresponding author

Correspondence to Claudius Gros.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Gros, C., Eckmann, S. & Echeveste, R. Should Hebbian learning be selective for negative excess kurtosis?. BMC Neurosci 16 (Suppl 1), P65 (2015).
