Poster presentation
Should Hebbian learning be selective for negative excess kurtosis?
BMC Neuroscience, volume 16, Article number P65 (2015)
Within the Hebbian learning paradigm, synaptic plasticity results in potentiation whenever pre- and postsynaptic activities are correlated, and in depression otherwise. This requirement is, however, not sufficient to determine the precise functional form of Hebbian learning, and a range of distinct formulations has been proposed hitherto. They differ, in particular, in the way runaway synaptic growth is avoided: by imposing a hard upper bound on the synaptic strength, by overall synaptic scaling, or by additive synaptic decay [1]. Here we propose a multiplicative Hebbian learning rule which is at the same time self-limiting and selective for negative excess kurtosis (for the case of symmetric input distributions) [2].
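The abstract does not spell out the functional form of the proposed rule. As a minimal sketch of what "multiplicative" and "self-limiting" mean in this context, the classic Oja rule can serve as an illustration: its decay term is multiplied by the weight itself, so the weight vector is bounded without a hard cap or an additive decay. All parameter values below are chosen for demonstration only; this is not the rule proposed in the abstract.

```python
import numpy as np

def oja_step(w, x, eta=0.002):
    """One multiplicative Hebbian update (Oja's rule).

    The decay term eta * y**2 * w scales with the synaptic weight
    itself, which bounds |w| near 1 (self-limiting) without a hard
    upper bound or an additive decay term.
    """
    y = w @ x                       # postsynaptic activity
    return w + eta * y * (x - y * w)

rng = np.random.default_rng(1)
w = np.array([0.3, 0.7])
for _ in range(50_000):
    # anisotropic zero-mean input: principal component along axis 0
    x = rng.normal(0.0, [2.0, 0.5])
    w = oja_step(w, x)
# w converges (up to sign) to the unit principal eigenvector [1, 0]
```

Under this dynamics the stochastic fixed point has unit norm and aligns with the leading principal component, which is why Oja-type rules perform a PCA.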
Hebbian learning naturally results in a principal component analysis (PCA), whenever one is present. Alternative formulations of the Hebbian learning paradigm differ, however, in other properties. Importantly, they may or may not perform an independent component analysis (ICA), whenever one is feasible. An ICA may be achieved by maximizing (minimizing) the excess kurtosis, whenever the latter is positive (negative) [3]. Noting that naturally occurring individual-cell lifetime firing rates are characterized by both a large kurtosis and a large skewness [4], we investigate in this paper the effect of skewness and kurtosis on the performance of both PCA and ICA (see Figure 1) for several Hebbian learning rules. A particular emphasis is placed on the differences between additive and multiplicative schemes. We find that multiplicative Hebbian plasticity rules select for both small excess kurtosis and large skewness, allowing them to perform an ICA, in contrast to additive rules.
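To make the kurtosis-based route to ICA concrete: on whitened data, a projection direction that extremizes the excess kurtosis recovers one independent source. The sketch below uses the standard fixed-point iteration of Hyvärinen and Oja (cube nonlinearity), not the Hebbian rules compared in the abstract; the source distributions and mixing matrix are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
# two independent sources: uniform (negative excess kurtosis, -1.2)
# and Laplace (positive excess kurtosis, +3)
s = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[0.8, 0.6], [-0.6, 0.8]])   # mixing matrix (a rotation)
x = A @ s

# whiten the mixture (zero mean, identity covariance)
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / n)
z = E @ np.diag(d**-0.5) @ E.T @ x

# fixed-point iteration extremizing the kurtosis of the projection w.z
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(50):
    w = (z * (w @ z)**3).mean(axis=1) - 3 * w
    w /= np.linalg.norm(w)

y = w @ z
kurt = (y**4).mean() - 3   # excess kurtosis of the recovered component
```

The recovered component `y` correlates (up to sign) with one of the original sources, and its excess kurtosis is far from zero, since the fixed points of the iteration are exactly the directions of extremal kurtosis.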
1. Abbott LF, Nelson SB: Synaptic plasticity: taming the beast. Nat Neurosci. 2000, 3:1178-1183.
2. Echeveste R, Gros C: Generating functionals for computational intelligence: the Fisher information as an objective function for self-limiting Hebbian learning rules. Front Robot AI. 2014, 1:1.
3. Hyvärinen A, Oja E: Independent component analysis: algorithms and applications. Neural Netw. 2000, 13:411-430.
4. Willmore B, Tolhurst DJ: Characterizing the sparseness of neural codes. Network-Comp Neural. 2001, 12:255-270.
The support of the German Science Foundation (DFG) and the German Academic Exchange Service (DAAD) is acknowledged.