- Poster presentation
- Open access

# Information-theoretic analysis of a dynamic release site using a two-channel model of depression

*BMC Neuroscience*
**volume 16**, Article number: P149 (2015)

Synapses are dynamic communication channels between neurons, as their rates of information transfer depend on past history. While information theory has been used to study the information efficacy of synapses [1–3], the effect of synaptic dynamics, including short-term depression and facilitation, on the information rate is not yet fully understood.

To reduce the complexity of the problem, we confine ourselves here to a single release site of the synapse. This allows us to analytically calculate the information transfer at the release site for a simple model of synaptic depression based on binary channels. The input of the model is a spike train, modeled as an independent identically distributed process $X = \{X_i\}_{i=1}^{\infty}$, where each $X_i$ has a Bernoulli distribution with $P(X_i = 0) = \alpha$. The model's output is a process $Y = \{Y_i\}_{i=1}^{\infty}$ such that $Y_i = 1$ if there is a release at time $i$, and $Y_i = 0$ otherwise. We model short-term depression by two binary asymmetric channels that represent the possible states of the release site: the 'recovered' state, when no release occurred in the previous time step (Figure 1a), and the 'used' state, following vesicle release (Figure 1b). In particular, we assume that the release probability is reduced following a release, that is, $p_2 \le p_1$ and $1 - q_2 \le 1 - q_1$.

Each individual channel in Figure 1 has a mutual information rate, $r_1$ or $r_2$. As $X_i$ is Bernoulli-distributed,

$$r_i = h\left(\alpha p_i + \bar{\alpha}\,\bar{q}_i\right) - \alpha\, h(p_i) - \bar{\alpha}\, h(\bar{q}_i), \quad i = 1, 2,$$

where $h(\cdot)$ is the entropy of a Bernoulli random variable and $\bar{x} = 1 - x$. We prove that the mutual information rate of the release site with depression is a linear combination of the information rates of these two channels. The mutual information rate $I(X;Y)$ between the input process $X$ and the output process $Y$ is $I(X;Y) = \theta r_1 + (1 - \theta) r_2$, where

$$\theta = \frac{\alpha \bar{p}_2 + \bar{\alpha} q_2}{\alpha\left(p_1 + \bar{p}_2\right) + \bar{\alpha}\left(\bar{q}_1 + q_2\right)}.$$
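The closed-form rate can be evaluated directly. The sketch below transcribes the formulas term by term ($h$ is the binary entropy, $\bar{x} = 1 - x$); the parameter values are arbitrary illustrations, not values from the study.

```python
from math import log2

def h(p):
    """Binary entropy of a Bernoulli(p) random variable, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def information_rate(alpha, p1, q1, p2, q2):
    """Mutual information rate I(X;Y) = theta*r1 + (1 - theta)*r2."""
    ab = 1 - alpha                      # alpha-bar
    # r_i = h(alpha*p_i + ab*(1 - q_i)) - alpha*h(p_i) - ab*h(1 - q_i)
    r = [h(alpha * p + ab * (1 - q)) - alpha * h(p) - ab * h(1 - q)
         for p, q in ((p1, q1), (p2, q2))]
    # Stationary weight of the 'recovered' channel.
    theta = (alpha * (1 - p2) + ab * q2) / (
        alpha * (p1 + 1 - p2) + ab * (1 - q1 + q2))
    return theta * r[0] + (1 - theta) * r[1]

rate = information_rate(alpha=0.5, p1=0.9, q1=0.95, p2=0.3, q2=0.99)
```

A useful sanity check: setting $p_2 = p_1$ and $q_2 = q_1$ makes the two channels identical, and the weighted sum collapses to the single-channel rate $r_1$.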

The closed-form expression of the mutual information rate allows us to study the effect of depression analytically. Through simulations, we show that for a range of parameters, depression improves the rate of information transfer at the release site. We also show that when the level of depression is increased (i.e., with smaller $p_2$ and larger $q_2$), the release site's information capacity is reached at lower input spike rates. The optimal spike rate of the presynaptic neuron therefore varies inversely with the depression level of its release site, which means that synaptic depression can save energy while maintaining the information rate. The two-channel model of the release site is a building block for the construction of more precise models of synaptic transmission; such models will enable us to evaluate and study synaptic information rates analytically.
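As a sketch of this kind of parameter study, one can sweep the input statistics for two depression levels and locate the input probability that maximizes the closed-form rate. The parameter values below are illustrative assumptions, not those of the study.

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def information_rate(alpha, p1, q1, p2, q2):
    """Closed-form rate I(X;Y) = theta*r1 + (1 - theta)*r2."""
    ab = 1 - alpha
    r1 = h(alpha * p1 + ab * (1 - q1)) - alpha * h(p1) - ab * h(1 - q1)
    r2 = h(alpha * p2 + ab * (1 - q2)) - alpha * h(p2) - ab * h(1 - q2)
    theta = (alpha * (1 - p2) + ab * q2) / (
        alpha * (p1 + 1 - p2) + ab * (1 - q1 + q2))
    return theta * r1 + (1 - theta) * r2

alphas = [i / 100 for i in range(1, 100)]
# Weak vs. strong depression (smaller p2, larger q2 = stronger depression).
weak   = max(alphas, key=lambda a: information_rate(a, 0.9, 0.95, 0.6, 0.97))
strong = max(alphas, key=lambda a: information_rate(a, 0.9, 0.95, 0.2, 0.99))
```

Comparing `weak` and `strong` for a given parameter regime shows how the rate-maximizing input statistics shift with the depression level, the relationship discussed above.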

## References

1. London M, Schreibman A, Häusser M, Larkum ME, Segev I: The information efficacy of a synapse. Nature Neuroscience. 2002, 5 (4): 332-340.

2. Fuhrmann G, Segev I, Markram H, Tsodyks M: Coding of temporal information by activity-dependent synapses. Journal of Neurophysiology. 2002, 87 (1): 140-148.

3. Goldman MS: Enhancement of information transmission efficiency by synaptic failures. Neural Computation. 2004, 16 (6): 1137-1162.

## Acknowledgement

This work was supported by the BMBF grant 01EO1401 (German Center for Vertigo and Balance Disorders).

## Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

## About this article

### Cite this article

Salmasi, M., Stemmler, M., Glasauer, S. *et al.* Information-theoretic analysis of a dynamic release site using a two-channel model of depression.
*BMC Neurosci* **16**
(Suppl 1), P149 (2015). https://doi.org/10.1186/1471-2202-16-S1-P149
