
  • Poster presentation
  • Open Access

Information-theoretic analysis of a dynamic release site using a two-channel model of depression

BMC Neuroscience 2015, 16 (Suppl 1): P149

https://doi.org/10.1186/1471-2202-16-S1-P149

Keywords

  • Spike Train
  • Information Transfer
  • Release Site
  • Information Rate
  • Spike Rate

Synapses are dynamic communication channels between neurons, as their rates of information transfer depend on past history. While information theory has been used to study the information efficacy of synapses [1-3], the effect of synaptic dynamics, including short-term depression and facilitation, on the information rate is not yet fully understood.

To reduce the complexity of the problem, we confine ourselves here to a single release site of the synapse. This allows us to analytically calculate the information transfer at the release site for a simple model of synaptic depression based on binary channels. The input of the model is a spike train, modeled as an independent, identically distributed process X = {X_i}, i = 1, 2, ..., where each X_i has a Bernoulli distribution with P(X_i = 1) = α. The model's output is a process Y = {Y_i} such that Y_i = 1 if there is a release at time step i, and Y_i = 0 otherwise. We model short-term depression by two binary asymmetric channels that represent the possible states of the release site: the 'recovered' state, when no release occurred in the previous time step (Figure 1a), and the 'used' state, following a vesicle release (Figure 1b). In state i, the channel parameter p_i is the probability of release given a spike and 1 - q_i is the probability of a spontaneous release. In particular, we assume that the release probability is reduced following a release, that is, p_2 ≤ p_1 and 1 - q_2 ≤ 1 - q_1.

Figure 1
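
As a concrete illustration, the release-site dynamics described above can be sketched as a short simulation. This is our own sketch, not code from the study; we assume the channel convention p_i = P(release | spike) and q_i = P(no release | no spike) for state i, which is consistent with the rate expressions in the text.

```python
import random

def simulate_release_site(alpha, p1, q1, p2, q2, n_steps=100_000, seed=0):
    """Simulate a single release site with two-state (recovered/used) depression.

    alpha  -- spike probability P(X_i = 1) of the Bernoulli input train
    p1, q1 -- recovered-state channel: P(Y=1|X=1) = p1, P(Y=0|X=0) = q1
    p2, q2 -- used-state channel:      P(Y=1|X=1) = p2, P(Y=0|X=0) = q2
    Returns the input spike train and output release train as 0/1 lists.
    """
    rng = random.Random(seed)
    x_train, y_train = [], []
    recovered = True                          # start in the recovered state
    for _ in range(n_steps):
        x = 1 if rng.random() < alpha else 0  # presynaptic spike this step?
        p, q = (p1, q1) if recovered else (p2, q2)
        release_prob = p if x == 1 else 1 - q
        y = 1 if rng.random() < release_prob else 0
        recovered = (y == 0)                  # a release leaves the site 'used'
        x_train.append(x)
        y_train.append(y)
    return x_train, y_train
```

With p_2 ≤ p_1 and 1 - q_2 ≤ 1 - q_1, releases in the simulated train should be followed by a lower empirical release rate than non-releases, reproducing depression.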

Each individual channel in Figure 1 has a mutual information rate, r_1 or r_2 respectively. As X_i is Bernoulli-distributed,

r_i = h(α p_i + (1 - α)(1 - q_i)) - α h(p_i) - (1 - α) h(1 - q_i),  for i = 1, 2,

where h(·) is the entropy of a Bernoulli random variable with the given success probability. We prove that the mutual information rate of the release site with depression is a weighted sum of the information rates of these two channels: the mutual information rate I(X;Y) between the input process X and the output process Y is

I(X;Y) = θ r_1 + (1 - θ) r_2,  where  θ = [α(1 - p_2) + (1 - α) q_2] / [α(p_1 + 1 - p_2) + (1 - α)(1 - q_1 + q_2)]

is the steady-state probability that the release site is in the recovered state.
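
The closed-form expressions translate directly into code. A minimal sketch (function names are ours), assuming the channel convention p_i = P(release | spike) and q_i = P(no release | no spike):

```python
from math import log2

def h(p):
    """Entropy (in bits) of a Bernoulli variable with success probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def channel_rate(alpha, p, q):
    """Information rate of one binary asymmetric channel:
    r = h(alpha*p + (1-alpha)*(1-q)) - alpha*h(p) - (1-alpha)*h(1-q)."""
    ab = 1 - alpha
    return h(alpha * p + ab * (1 - q)) - alpha * h(p) - ab * h(1 - q)

def information_rate(alpha, p1, q1, p2, q2):
    """I(X;Y) = theta*r1 + (1-theta)*r2, with theta the steady-state
    probability of the recovered state."""
    ab = 1 - alpha
    theta = (alpha * (1 - p2) + ab * q2) / (
        alpha * (p1 + 1 - p2) + ab * (1 - q1 + q2)
    )
    r1 = channel_rate(alpha, p1, q1)
    r2 = channel_rate(alpha, p2, q2)
    return theta * r1 + (1 - theta) * r2
```

As a sanity check, setting p_1 = p_2 and q_1 = q_2 (no depression) collapses the weighted sum back to the single-channel rate.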

The closed-form expression for the mutual information rate allows us to study the effect of depression analytically. Through simulations we show that, for a range of parameters, depression improves the rate of information transfer at the release site. We also show that when the level of depression is increased (i.e., with smaller p_2 and larger q_2), the release site's information capacity is reached at lower input spike rates. The optimal spike rate of the presynaptic neuron therefore has an inverse relationship with the depression level of its release site, which means that synaptic depression can save energy while maintaining the information rate. The two-channel model of the release site is a building block for the construction of more precise models of synaptic transmission; such models will enable us to evaluate and study synaptic information rates analytically.
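
The reported trade-off can be explored with a simple grid search over the input spike probability α. A self-contained sketch, restating the closed-form rate; the parameter values are illustrative and not taken from the study:

```python
from math import log2

def h(p):
    """Entropy (in bits) of a Bernoulli variable with success probability p."""
    return 0.0 if p <= 0.0 or p >= 1.0 else -p * log2(p) - (1 - p) * log2(1 - p)

def info_rate(alpha, p1, q1, p2, q2):
    """Closed-form I(X;Y) = theta*r1 + (1 - theta)*r2 for the depressing site."""
    ab = 1 - alpha

    def rate(p, q):  # information rate of one binary asymmetric channel
        return h(alpha * p + ab * (1 - q)) - alpha * h(p) - ab * h(1 - q)

    theta = (alpha * (1 - p2) + ab * q2) / (
        alpha * (p1 + 1 - p2) + ab * (1 - q1 + q2)
    )
    return theta * rate(p1, q1) + (1 - theta) * rate(p2, q2)

def optimal_spike_prob(p1, q1, p2, q2, grid=999):
    """Grid-search the input spike probability alpha that maximizes I(X;Y)."""
    alphas = [(k + 1) / (grid + 1) for k in range(grid)]
    return max(alphas, key=lambda a: info_rate(a, p1, q1, p2, q2))

# Illustrative comparison: same recovered-state channel, two depression levels.
alpha_mild = optimal_spike_prob(p1=0.9, q1=0.99, p2=0.6, q2=0.99)
alpha_strong = optimal_spike_prob(p1=0.9, q1=0.99, p2=0.2, q2=0.99)
```

Comparing the two optima for increasing depression (smaller p_2) is the kind of sweep the abstract describes when relating the optimal presynaptic spike rate to the depression level.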

Declarations

Acknowledgement

This work was supported by the BMBF grant 01EO1401 (German Center for Vertigo and Balance Disorders).

Authors’ Affiliations

(1)
Graduate School of Systemic Neurosciences, Ludwig-Maximilian University, Munich, Germany
(2)
German Center for Vertigo and Balance Disorders, Ludwig-Maximilian University, Munich, Germany
(3)
Bernstein Center for Computational Neuroscience, Munich, Germany
(4)
Department of Biology II, Ludwig-Maximilian University, Munich, Germany
(5)
Department of Neurology, Ludwig-Maximilian University, Munich, Germany

References

  1. London M, Schreibman A, Häusser M, Larkum ME, Segev I: The information efficacy of a synapse. Nature Neuroscience. 2002, 5(4): 332-340.
  2. Fuhrmann G, Segev I, Markram H, Tsodyks M: Coding of temporal information by activity-dependent synapses. Journal of Neurophysiology. 2002, 87(1): 140-148.
  3. Goldman MS: Enhancement of information transmission efficiency by synaptic failures. Neural Computation. 2004, 16(6): 1137-1162.

Copyright

© Salmasi et al. 2015

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
