Volume 9 Supplement 1

Seventeenth Annual Computational Neuroscience Meeting: CNS*2008

Open Access

An online Hebbian learning rule that performs independent component analysis

BMC Neuroscience 2008, 9(Suppl 1):O13

DOI: 10.1186/1471-2202-9-S1-O13

Published: 11 July 2008

The so-called cocktail party problem refers to a situation in which several sound sources are active simultaneously, e.g. several people talking at the same time, and the goal is to recover the original sources from measurements of the mixed signals. A standard approach to the cocktail party problem is independent component analysis (ICA), which can be performed by a class of powerful algorithms. However, classical algorithms based on higher moments of the signal distribution [1] ignore temporal correlations: data points corresponding to different time slices could be shuffled without changing the result. Yet time order matters, since most natural signal sources have intrinsic temporal correlations that could potentially be exploited. Several algorithms have therefore been developed to take these temporal correlations into account, e.g. algorithms based on delayed correlations [2, 3], possibly combined with higher-order statistics [4], on innovation processes [5], or on complexity pursuit [6]. These methods, however, are rather algorithmic, and most are difficult to interpret biologically: they are not online, not local, or require preprocessing of the data.
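For illustration, the delayed-correlation approach of Molgedey and Schuster [3] can be sketched in a few lines of NumPy: after whitening the mixtures, the eigenvectors of a symmetrized time-lagged covariance matrix give the rotation that separates the sources. The toy signals, mixing matrix, and delay below are arbitrary choices for this sketch, not values from the paper.

```python
import numpy as np

t = np.arange(5000)
# Two temporally correlated toy sources (arbitrary choices for this sketch)
s = np.vstack([np.sin(0.02 * t), np.sign(np.sin(0.05 * t + 1.0))])
C = np.array([[1.0, 0.6], [0.4, 1.0]])    # mixing matrix
x = C @ s                                  # observed mixtures, x = Cs

# Whiten the mixtures so that their covariance is the identity
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / x.shape[1])
z = E @ np.diag(d ** -0.5) @ E.T @ x

# Symmetrized covariance between the whitened signals at delay tau
tau = 5
M = z[:, :-tau] @ z[:, tau:].T / (z.shape[1] - tau)
M = (M + M.T) / 2

# Its eigenvectors give the rotation that separates the sources
_, U = np.linalg.eigh(M)
y = U.T @ z    # recovered sources, up to permutation, sign, and scale
```

Separation succeeds here because the two sources have different autocorrelations at delay tau, so the lagged covariance has distinct eigenvalues.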

Biological learning algorithms are usually implemented as online Hebbian learning rules that trigger changes of synaptic efficacy based on the correlations between pre- and postsynaptic activity. A Hebbian learning rule such as Oja's rule [7], combined with a linear neuron model, has been shown to perform principal component analysis (PCA). Simply replacing the linear neuron with a nonlinear one allows Oja's rule to compute higher moments of the distributions, which yields ICA provided the signals have been whitened at an earlier stage [1]. Here, we are instead interested in exploiting the correlations of the signals at different time delays, i.e. a generalization of the theory of Molgedey and Schuster [3]. We show that a linear neuron model combined with a Hebbian learning rule based on the joint firing rates of the pre- and postsynaptic neurons at different time delays performs ICA by exploiting the temporal correlations of the presynaptic inputs (Figure 1).
Figure 1

The sources s are mixed by a matrix C, x = Cs, where x are the presynaptic signals. Using a linear neuron y = Wx, the weights W are updated according to the Hebbian rule so that the postsynaptic signals y recover the sources s.
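The abstract does not give the exact form of the update, but an online rule in this spirit, a lagged variant of Oja's rule that changes w in proportion to products of pre- and postsynaptic activity at two different times (with explicit weight normalization), can be sketched as follows for a single unit. The toy signals, delay, and learning rate are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(20000)
# Toy sources and mixing matrix (illustrative assumptions, not from the paper)
s = np.vstack([np.sin(0.02 * t), np.sign(np.sin(0.05 * t + 1.0))])
C = np.array([[1.0, 0.6], [0.4, 1.0]])
x = C @ s                                   # presynaptic signals, x = Cs

# Whiten the presynaptic signals
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / x.shape[1])
z = E @ np.diag(d ** -0.5) @ E.T @ x

tau, eta = 5, 1e-3                          # delay and learning rate (assumed)
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for n in range(tau, z.shape[1]):
    pre_now, pre_past = z[:, n], z[:, n - tau]
    post_now, post_past = w @ pre_now, w @ pre_past
    # Hebbian update from joint pre-/postsynaptic activity at two time delays
    w += eta * (post_now * pre_past + post_past * pre_now)
    w /= np.linalg.norm(w)                  # explicit normalization, in place
                                            # of Oja's implicit weight decay
y = w @ z                                   # postsynaptic signal
```

A single unit of this kind converges to the component with the strongest autocorrelation at the chosen delay; recovering all sources would require several units together with a mechanism that decorrelates their outputs.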

Authors’ Affiliations

CND, University of Ottawa


  1. Hyvärinen A, Karhunen J, Oja E: Independent Component Analysis. 2001, Wiley-Interscience.
  2. Ziehe A, Müller K: TDSEP – an efficient algorithm for blind separation using time structure. Proc Int Conf on Art Neur Net. 1998.
  3. Molgedey L, Schuster H: Separation of a mixture of independent signals using time delayed correlations. Phys Rev Lett. 1994, 72: 3634-3637. DOI: 10.1103/PhysRevLett.72.3634.
  4. Müller K, Philips P, Ziehe A: JADE TD: Combining higher-order statistics and temporal information for blind source separation (with noise). Proc Int Workshop on ICA. 1999.
  5. Hyvärinen A: Independent component analysis for time-dependent stochastic processes. Proc Int Conf on Art Neur Net. 1998.
  6. Hyvärinen A: Complexity pursuit: Separating interesting components from time-series. Neural Comput. 2001, 13: 883-898. DOI: 10.1162/089976601300014394.
  7. Oja E: A simplified neuron model as principal component analyzer. J Math Biol. 1982, 15: 267-273. DOI: 10.1007/BF00275687.


© Clopath et al; licensee BioMed Central Ltd. 2008

This article is published under license to BioMed Central Ltd.