Volume 16 Supplement 1

## 24th Annual Computational Neuroscience Meeting: CNS*2015

- Poster presentation
- Open Access

# Information-theoretic analysis of a dynamic release site using a two-channel model of depression

Mehrdad Salmasi^{1, 2} (Email author), Martin Stemmler^{3, 4}, Stefan Glasauer^{1, 2, 3, 5} and Alex Loebel^{3, 4}

**16 (Suppl 1)**:P149

https://doi.org/10.1186/1471-2202-16-S1-P149

© Salmasi et al. 2015

**Published:** 18 December 2015

## Keywords

- Spike Train
- Information Transfer
- Release Site
- Information Rate
- Spike Rate

Synapses are dynamic communication channels between neurons as their rates of information transfer depend on past history. While information theory has been used to study the information efficacy of synapses [1–3], the effect of synaptic dynamics, including short-term depression and facilitation, on the information rate is not yet fully understood.

We model a single release site as a binary communication channel whose input is the presynaptic spike train, described by the binary process $X = \{X_i\}_{i=1}^{\infty}$, where $X_i = 1$ if the presynaptic neuron spikes at time step $i$ and $X_i = 0$ otherwise; each $X_i$ has a Bernoulli distribution with $P(X_i = 1) = \alpha$. The model's output is the process $Y = \{Y_i\}_{i=1}^{\infty}$, such that $Y_i = 1$ if there is a release at time step $i$ and $Y_i = 0$ otherwise. We model short-term depression by two binary asymmetric channels that represent the possible states of the release site: the 'recovered' state, when no release occurred in the previous time step (Figure 1a), and the 'used' state, following a vesicle release (Figure 1b). In particular, we assume that the release probability is reduced following a release, that is, $p_2 \leq p_1$ and $1 - q_2 \leq 1 - q_1$.
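As an illustration, the two-state dynamics can be simulated directly. The sketch below uses hypothetical parameter values (the abstract does not specify $\alpha$, $p_1$, $q_1$, $p_2$, $q_2$; it only requires $p_2 \leq p_1$ and $1 - q_2 \leq 1 - q_1$) and treats a release at step $i$ as putting the site in the 'used' state at step $i + 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (not from the abstract); the model only
# requires p2 <= p1 and (1 - q2) <= (1 - q1).
alpha = 0.3           # P(X_i = 1): presynaptic spike probability per time step
p1, p2 = 0.8, 0.4     # P(release | spike) in the recovered / used state
q1, q2 = 0.98, 0.995  # P(no release | no spike) in the recovered / used state

def simulate(n_steps):
    """Simulate the release site; a release at step i puts it in the 'used' state at i + 1."""
    x = rng.random(n_steps) < alpha      # presynaptic spike train X
    y = np.zeros(n_steps, dtype=bool)    # release train Y
    recovered = True                     # start in the 'recovered' state
    for i in range(n_steps):
        if recovered:
            p_release = p1 if x[i] else 1 - q1
        else:
            p_release = p2 if x[i] else 1 - q2
        y[i] = rng.random() < p_release
        recovered = not y[i]             # no release -> recovered at the next step
    return x, y

x, y = simulate(100_000)
print("spike rate:", x.mean(), "release rate:", y.mean())
```

With these parameters the empirical release rate settles near the stationary probability of the 'used' state, since being 'used' at step $i + 1$ is the same event as releasing at step $i$.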

Each individual channel in Figure 1 has a mutual information rate, $r_1$ or $r_2$. As $X_i$ is Bernoulli-distributed, $r_i = h\left(\alpha p_i + \bar{\alpha}\,\overline{q_i}\right) - \alpha\, h\left(p_i\right) - \bar{\alpha}\, h\left(\overline{q_i}\right)$ for $i = 1, 2$, where $h(\cdot)$ is the entropy of a Bernoulli random variable and $\bar{x} = 1 - x$. We prove that the mutual information rate of the release site with depression is a linear combination of the information rates of these two channels: the mutual information rate $I(X;Y)$ between the input process $X$ and the output process $Y$ is $I(X;Y) = \theta r_1 + (1 - \theta) r_2$, where $\theta = \frac{\alpha\,\overline{p_2} + \bar{\alpha}\, q_2}{\alpha\left(p_1 + \overline{p_2}\right) + \bar{\alpha}\left(\overline{q_1} + q_2\right)}$ is the stationary probability that the release site is in the recovered state.
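The closed-form rate is straightforward to evaluate numerically. The following minimal sketch implements the expressions above with illustrative parameter values (none are given in the abstract); `h` is the binary entropy in bits:

```python
import numpy as np

def h(p):
    """Entropy (in bits) of a Bernoulli(p) random variable."""
    p = np.clip(p, 1e-12, 1 - 1e-12)  # clip to avoid log(0)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def info_rate(alpha, p1, q1, p2, q2):
    """I(X;Y) = theta*r1 + (1 - theta)*r2 from the closed-form expression."""
    ab = 1 - alpha  # alpha-bar
    r1 = h(alpha * p1 + ab * (1 - q1)) - alpha * h(p1) - ab * h(1 - q1)
    r2 = h(alpha * p2 + ab * (1 - q2)) - alpha * h(p2) - ab * h(1 - q2)
    # stationary weight of the 'recovered' state
    theta = (alpha * (1 - p2) + ab * q2) / (
        alpha * (p1 + 1 - p2) + ab * (1 - q1 + q2))
    return theta * r1 + (1 - theta) * r2

# Illustrative parameters (not from the abstract)
print("I(X;Y) =", info_rate(0.3, 0.8, 0.98, 0.4, 0.995), "bits per time step")
```

As a sanity check, a static noiseless channel ($p_1 = p_2 = 1$, $q_1 = q_2 = 1$) transmits the spike train losslessly, so the rate reduces to $h(\alpha)$, i.e. one bit per step at $\alpha = 0.5$.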

The closed-form expression of the mutual information rate allows us to study the effect of depression analytically. Through simulations we show that, for a range of parameters, depression improves the rate of information transfer at the release site. We also show that when the level of depression is increased (i.e., with smaller *p*_{2} and larger *q*_{2}), the release site's information capacity is reached at lower input spike rates. The optimal spike rate of the presynaptic neuron therefore has an inverse relationship with the depression level of its release site, which means that synaptic depression can save energy while maintaining the information rate. The two-channel model of the release site is a building block for the construction of more detailed models of synaptic transmission, which will allow synaptic information rates to be evaluated and studied analytically.
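The shift of the optimal input rate can be illustrated with the closed-form expression by sweeping α for a weakly and a strongly depressing site. In this sketch the parameters are illustrative (only $p_2$ is varied between the two conditions; the abstract's own parameter values are not given):

```python
import numpy as np

def h(p):
    """Entropy (in bits) of a Bernoulli(p) random variable."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def info_rate(alpha, p1, q1, p2, q2):
    """Closed-form I(X;Y) of the two-channel release-site model."""
    ab = 1 - alpha
    r1 = h(alpha * p1 + ab * (1 - q1)) - alpha * h(p1) - ab * h(1 - q1)
    r2 = h(alpha * p2 + ab * (1 - q2)) - alpha * h(p2) - ab * h(1 - q2)
    theta = (alpha * (1 - p2) + ab * q2) / (
        alpha * (p1 + 1 - p2) + ab * (1 - q1 + q2))
    return theta * r1 + (1 - theta) * r2

alphas = np.linspace(0.01, 0.99, 99)
weak = [info_rate(a, 0.8, 0.99, 0.6, 0.99) for a in alphas]    # mild depression
strong = [info_rate(a, 0.8, 0.99, 0.2, 0.99) for a in alphas]  # strong depression
a_weak = alphas[np.argmax(weak)]
a_strong = alphas[np.argmax(strong)]
print("optimal alpha (weak depression): ", a_weak)
print("optimal alpha (strong depression):", a_strong)
```

For these parameters, the information-maximizing spike probability of the strongly depressing site lies below that of the weakly depressing one, consistent with the inverse relationship described above.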

## Declarations

### Acknowledgement

This work was supported by the BMBF grant 01EO1401 (German Center for Vertigo and Balance Disorders).

## References

1. London M, Schreibman A, Häusser M, Larkum ME, Segev I: The information efficacy of a synapse. Nature Neuroscience. 2002, 5 (4): 332-340.
2. Fuhrmann G, Segev I, Markram H, Tsodyks M: Coding of temporal information by activity-dependent synapses. Journal of Neurophysiology. 2002, 87 (1): 140-148.
3. Goldman MS: Enhancement of information transmission efficiency by synaptic failures. Neural Computation. 2004, 16 (6): 1137-1162.

## Copyright

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.