  • Poster presentation
  • Open Access

Mutual information density of stochastic integrate-and-fire models

BMC Neuroscience 2013, 14(Suppl 1):P245

https://doi.org/10.1186/1471-2202-14-S1-P245

Keywords

  • Mutual Information
  • Numerical Procedure
  • Full Information
  • Information Transfer
  • Neuronal Response

The coherence function of integrate-and-fire neurons shows low-pass behavior across a wide range of firing regimes [1]. While the coherence function provides a good approximation to the full information transfer for weak inputs, for strong inputs nonlinear encoding can play an important role. The complete information transfer is quantified by Shannon's mutual information rate [2], which has been estimated in certain biological model systems [3]. In general, an exact analytical calculation of the mutual information rate is infeasible, and even its numerical estimation is demanding [4].
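
For reference, the standard coherence-based lower bound on the mutual information rate between a Gaussian stimulus s(t) and a response x(t) reads

    R_{\mathrm{LB}} = -\int_0^\infty \log_2\!\left[\, 1 - C(f) \,\right] \mathrm{d}f,
    \qquad
    C(f) = \frac{|S_{xs}(f)|^2}{S_{xx}(f)\, S_{ss}(f)},

where S_{xs}(f) is the stimulus-response cross-spectrum and S_{xx}(f), S_{ss}(f) are the respective power spectra; this bound is tight only in the weak-signal (linear-response) limit.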

Numerical estimation of the mutual information rate has become common practice, but it does not reveal which aspects of the stimulus are best represented by the neuronal response. We developed a numerical procedure to directly calculate a frequency-selective version of the mutual information rate. This can be used to study how different frequency components of a Gaussian stimulus are encoded in neural models without invoking a weak-signal approximation.
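
As an orientation point, the following minimal Python sketch (the LIF setup and all parameter values are assumptions for illustration; it is not the frequency-selective procedure developed here) simulates a leaky integrate-and-fire neuron driven by Gaussian white noise and evaluates the coherence-based lower bound on the information density, against which such a frequency-selective mutual information estimate can be compared:

    import numpy as np
    from scipy.signal import coherence

    # Illustrative sketch only (not the procedure developed in this work):
    # simulate a leaky integrate-and-fire neuron driven by Gaussian white noise
    # and evaluate the coherence-based lower bound on the information density,
    # i_LB(f) = -log2[1 - C(f)]. All parameter values are assumptions.

    rng = np.random.default_rng(seed=1)

    dt = 1e-4      # time step (s)
    T = 100.0      # total simulation time (s)
    n = int(T / dt)
    tau = 0.02     # membrane time constant (s)
    mu = 0.8       # base drive (threshold = 1, reset = 0)
    D = 1e-3       # noise intensity of the Gaussian stimulus

    # Discretized white noise with <s(t) s(t')> = 2 D delta(t - t')
    stimulus = np.sqrt(2.0 * D / dt) * rng.standard_normal(n)

    v = 0.0
    spike_train = np.zeros(n)
    for i in range(n):
        # Euler step of tau dv/dt = mu - v + s(t)
        v += (dt / tau) * (mu - v + stimulus[i])
        if v >= 1.0:                     # threshold crossing
            spike_train[i] = 1.0 / dt    # delta spike of unit area in this bin
            v = 0.0                      # reset

    # Stimulus-response coherence C(f), estimated with Welch's method
    f, C = coherence(stimulus, spike_train, fs=1.0 / dt, nperseg=4096)
    C = np.clip(C, 0.0, 1.0 - 1e-12)

    # Frequency-resolved lower bound (bits/s per Hz) and its integral
    info_density = -np.log2(1.0 - C)
    rate_lb = np.sum(info_density) * (f[1] - f[0])
    print(f"coherence-based lower bound: {rate_lb:.2f} bits/s")

The resulting info_density gives the frequency dependence predicted by linear-response theory; deviations from a direct, frequency-selective mutual information estimate would then indicate nonlinear encoding.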

Declarations

Acknowledgements

This work was funded by the BMBF (FKZ: 01GQ1001A).

Authors’ Affiliations

(1)
Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany
(2)
Department of Physics, Freie Universität Berlin, Berlin, 14195, Germany
(3)
Department of Physics, Humboldt-Universität zu Berlin, Berlin, 12489, Germany

References

  1. Vilela RD, Lindner B: Comparative study of different integrate-and-fire neurons: spontaneous activity, dynamical response, and stimulus-induced correlation. Phys Rev E. 2009, 80: 031909.
  2. Shannon C: A mathematical theory of communication. Bell System Technical Journal. 1948, 27: 379-423, 623-656.
  3. Strong SP, Koberle R, de Ruyter van Steveninck R, Bialek W: Entropy and information in neural spike trains. Phys Rev Lett. 1998, 80 (1): 197-200. 10.1103/PhysRevLett.80.197.
  4. Panzeri S, Senatore R, Montemurro MA, Petersen RS: Correcting for the sampling bias problem in spike train information measures. J Neurophysiol. 2007, 98 (3): 1064-1072. 10.1152/jn.00559.2007.
