- Poster presentation
- Open Access
Nonlinear variability measures for respiratory rhythm generation
© Alsharif and Fietkiewicz; licensee BioMed Central Ltd. 2014
- Published: 21 July 2014
- Mutual Information
- Information Measure
- Respiratory Control
- Periodic Pattern
- Surrogate Data
Variability in rhythmic physiological systems is of great interest, especially with regard to identifying deterministic variability. Algorithms have been proposed to quantify nonlinear variability, but the results can be difficult to interpret. Proper interpretation of traditional statistical methods depends on knowledge of the underlying distribution (e.g., Gaussian). Measures of nonlinear variability must likewise be interpreted in the context of the particular characteristics of the system being analyzed.
Analysis methods are often first developed in the context of a particular application and then popularized as general techniques. For example, approximate entropy was developed to analyze the electrocardiogram [2, 3] and is now used extensively in other areas. Although the various algorithms may be well understood, parameter selection is typically based on tradition. Additionally, proper interpretation requires surrogate data analysis; however, surrogate data is often poorly understood by experimentalists, both with regard to the algorithms used and to the data they generate.
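As a concrete illustration of the surrogate data analysis mentioned above, the following is a minimal sketch of the Fourier-transform (phase-randomization) method of Schreiber and Schmitz [4]: the surrogate preserves the power spectrum (and hence all linear correlations) of the original series while destroying any nonlinear temporal structure. The function name and parameters are illustrative assumptions, not taken from the poster.

```python
import numpy as np

def phase_randomized_surrogate(x, seed=None):
    """Fourier-transform surrogate: keeps the power spectrum of `x`
    but randomizes the phases, destroying nonlinear structure."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    spectrum = np.fft.rfft(x)
    # Draw uniform random phases for every frequency bin.
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    surrogate = np.abs(spectrum) * np.exp(1j * phases)
    surrogate[0] = spectrum[0]        # keep the DC component (the mean)
    if n % 2 == 0:
        surrogate[-1] = spectrum[-1]  # the Nyquist bin must remain real
    return np.fft.irfft(surrogate, n=n)
```

An information measure computed on the original series is then compared against the distribution of the same measure over an ensemble of such surrogates; values outside that distribution suggest structure beyond linear correlations.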
Information measures such as sample entropy and mutual information have recently been applied to the neural generation of respiratory rhythms. Here we investigate two aspects of applying information measures to periodic patterns such as those found in respiration. First, because surrogate data analysis is critical for interpreting results from information measures, we evaluate its application to hypothesis testing in respiratory control. Second, because information measures do not necessarily distinguish between stochastic and deterministic sources of variability, we use both experimental and simulated data to study the relative effects of each source on sample entropy and mutual information algorithms.
- Kaffashi F, Foglyano R, Wilson CG, Loparo KA: The effect of time delay on Approximate & Sample Entropy calculations. Physica D. 2008, 237: 3069-3074. 10.1016/j.physd.2008.06.005.
- Pincus SM: Approximate entropy as a measure of system complexity. Proc Natl Acad Sci U S A. 1991, 88 (6): 2297-301. 10.1073/pnas.88.6.2297.
- Pincus SM, Gladstone IM, Ehrenkranz RA: A regularity statistic for medical data analysis. J Clin Monit. 1991, 7 (4): 335-45. 10.1007/BF01619355.
- Schreiber T, Schmitz A: Surrogate time series. Physica D. 2000, 142: 346-382. 10.1016/S0167-2789(00)00043-9.
- Dhingra RR, Jacono FJ, Fishman M, Loparo KA, Rybak IA, Dick TE: Vagal-dependent nonlinear variability in the respiratory pattern of anesthetized, spontaneously breathing rats. J Appl Physiol (1985). 2011, 111 (1): 272-84.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.