Do high order cross-correlations among neurons convey useful information?
© Senatore and Panzeri; licensee BioMed Central Ltd. 2007
Published: 6 July 2007
Populations of neurons in the brain encode information about the external environment as time series of spikes. A fundamental question in systems neuroscience is how this information can be decoded by a downstream neural system. Since the responses of different neurons are statistically correlated, it is possible that such correlations convey important information and thus need to be taken into account by any decoding algorithm. Although coding by correlations may increase the capacity of a neural population to encode information, it may greatly complicate decoding. In fact, it is possible that all neurons within the population interact with each other, and that their interaction cannot be described only in terms of "pair-wise" or low-order interactions between neurons, but instead reflects a genuine higher-order interaction among a larger population. In such a case, the number of parameters describing these correlations would increase exponentially with the population size, drastically increasing the complexity of the codes we want to investigate. On the other hand, it is also possible that a downstream system can access all the information available in the population activity even when taking into account only low-order correlations among neurons. In this way, the brain could exploit some of the representational capacity offered by correlation codes, while at the same time limiting the complexity needed to decode them.
Conceptualizing neurons as communication channels, we can quantify how much Shannon mutual information, I, is available to a decoder that observes the neural responses and knows the true stimulus-response probabilities. We can also compare it with a lower bound, I_k, on how much information could be extracted by a decoder that assumes a simpler correlation structure, taking into account only statistical correlations between neurons up to the k-th order [2, 3]. A principled way to construct such models from experimental data is to build the maximum-entropy response probability among those with the same marginal probabilities (and thus correlations) up to order k as the real population responses. We quantify the importance of higher-order correlations as the decoding cost (I − I_k) of neglecting them.
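As a minimal sketch of this comparison (not the exact bound construction of [2, 3]; the toy stimuli, response probabilities, and the restriction to order k = 1 are illustrative assumptions), the following computes the full mutual information of a two-neuron population and the information retained by the order-1 maximum-entropy model, whose conditional response distribution is the product of the single-neuron marginals:

```python
import numpy as np

def mutual_information(p_sr):
    """I(S;R) in bits from a joint probability table p(s, r)."""
    p_s = p_sr.sum(axis=1, keepdims=True)
    p_r = p_sr.sum(axis=0, keepdims=True)
    nz = p_sr > 0
    return float((p_sr[nz] * np.log2(p_sr[nz] / (p_s @ p_r)[nz])).sum())

# Toy example: two binary neurons, two equiprobable stimuli.
# p_r_given_s[s, r1, r2]: the neurons fire together for s=0
# and in opposition for s=1.
p_r_given_s = np.array([[[0.5, 0.0],
                         [0.0, 0.5]],
                        [[0.0, 0.5],
                         [0.5, 0.0]]])

# Order-1 maximum-entropy model: product of single-neuron
# marginals, computed separately for each stimulus.
m1 = p_r_given_s.sum(axis=2)           # p(r1 | s)
m2 = p_r_given_s.sum(axis=1)           # p(r2 | s)
q_r_given_s = m1[:, :, None] * m2[:, None, :]

p_s = np.array([0.5, 0.5])
I_full = mutual_information(p_s[:, None] * p_r_given_s.reshape(2, 4))
I_1 = mutual_information(p_s[:, None] * q_r_given_s.reshape(2, 4))
decoding_cost = I_full - I_1           # cost of neglecting the correlation
```

In this deliberately extreme case all of the information is carried by the pairwise correlation, so I_1 = 0 while I = 1 bit; for real data the gap (I − I_k) shrinks as k grows, and its size measures how much a decoder loses by ignoring correlations above order k.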
We demonstrate that, by using appropriate statistical bias-correction techniques, this lower bound can be made data-robust and computed with the limited number of trials typically recorded in neurophysiology experiments. With 200 trials per stimulus, we could compute the contribution of information conveyed by all higher-order correlations for groups of up to 10 neurons. Taken together, these results suggest that the method proposed here can unravel the role of high-order correlations among neurons in sensory coding, thus giving insight into the complexity of the code.
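The need for bias correction can be illustrated with a generic shuffle-based sketch (the estimators used in the study combine shuffling with model selection and are not reproduced here; the trial counts and response alphabet below are arbitrary assumptions). With a finite number of trials, the plug-in mutual-information estimator is biased upward even when stimulus and response are independent, and the average information obtained after permuting the stimulus labels estimates that bias:

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_mi(s, r, n_s, n_r):
    """Plug-in (empirical-frequency) estimate of I(S;R) in bits."""
    counts = np.zeros((n_s, n_r))
    np.add.at(counts, (s, r), 1)
    p = counts / counts.sum()
    p_s = p.sum(axis=1, keepdims=True)
    p_r = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (p_s @ p_r)[nz])).sum())

# Stimulus and response drawn independently: the true information is 0,
# yet the plug-in estimate from 50 trials is positive (limited-sampling bias).
n_trials = 50
s = rng.integers(0, 2, n_trials)
r = rng.integers(0, 8, n_trials)
i_raw = plugin_mi(s, r, 2, 8)

# Shuffle correction: the mean information over permuted stimulus labels
# estimates the bias, which is then subtracted.
i_shuffle = np.mean([plugin_mi(rng.permutation(s), r, 2, 8)
                     for _ in range(200)])
i_corrected = i_raw - i_shuffle
```

The raw estimate is spuriously positive despite the true value being zero, while the shuffle-corrected value is pulled back toward it; this is the kind of limited-sampling distortion that must be removed before (I − I_k) can be trusted on experimental trial counts.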
Thanks to M. A. Montemurro for useful discussions. Supported by Pfizer Global Development RTD and the Royal Society.
- Averbeck BB, Latham PE, Pouget A: Neural correlations, population coding and computation. Nat Rev Neurosci. 2006, 7: 358-366. doi:10.1038/nrn1888.
- Latham PE, Nirenberg S: Synergy, redundancy, and independence in population codes, revisited. J Neurosci. 2005, 25: 5195-5206. doi:10.1523/JNEUROSCI.5319-04.2005.
- Amari SH: Information geometry on hierarchy of probability distributions. IEEE Trans Inform Theory. 2001, 47: 1701-1711. doi:10.1109/18.930911.
- Montemurro MA, Senatore R, Panzeri S: Tight data-robust bounds to mutual information combining shuffling and model selection techniques. Neural Computation. 2007.