Volume 10 Supplement 1

Eighteenth Annual Computational Neuroscience Meeting: CNS*2009

Open Access

Quantifying the complexity of neural network output using entropy measures

  • Ryan M Foglyano1,
  • Farhad Kaffashi2,
  • Thomas E Dick3,
  • Kenneth A Loparo2 and
  • Christopher G Wilson1
BMC Neuroscience 2009, 10(Suppl 1): P322

DOI: 10.1186/1471-2202-10-S1-P322

Published: 13 July 2009


Countless methods exist to quantify neurophysiological signals, yet determining the most appropriate tool for extracting features is a daunting task. A growing body of literature has investigated the use of entropy for measuring "complexity" in signals. We present the application of a suite of entropy measures to neural network outputs, comparing and contrasting their ability to identify signal characteristics not captured by variance-based measures of regularity. Our previous work [1] showed that modifications to existing algorithms may be necessary to accurately capture nonlinear signal components. We have built upon this work, revealed interesting features in a commonly used preparation, and propose that our entropy tools will be useful to a wide variety of scientists.


We used the in vitro respiratory slice preparation from neonatal rats [2] and an in silico model (NEURON) of this system. In brief, a brainstem slice containing the preBötzinger complex, premotoneurons, and XII motoneurons is surgically removed, placed in a chamber with artificial cerebrospinal fluid, and recorded from electrophysiologically. This slice contains neural circuitry that is necessary and sufficient to generate spontaneous rhythmic activity. To test changes in network complexity, network excitability was altered by changing the extracellular [K+].

Our entropy work focused on three measures: Approximate Entropy (ApEn), Sample Entropy (SampEn), and the entropy of interburst intervals (EnInt). We interpret larger entropy values as indicating less predictability (ApEn and SampEn) or greater information density (EnInt). ApEn and SampEn were calculated for the fictive respiratory "bursts" (in vitro), and EnInt was applied to the interburst intervals (in vitro and in silico).
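To make the SampEn measure concrete, the following is a minimal sketch of a standard Sample Entropy calculation (not the authors' freely available code): count pairs of length-*m* template vectors whose Chebyshev distance falls below a tolerance *r*, repeat for length *m*+1, and take the negative log of the ratio. The defaults *m* = 2 and *r* = 0.2 × SD are common conventions, not values taken from this abstract.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample Entropy (SampEn) of a 1-D signal.

    Counts pairs of template vectors of length m (and m + 1) whose
    Chebyshev distance is below r, excluding self-matches, and
    returns -ln(A / B). Lower values indicate a more regular signal.
    Defaults (m=2, r=0.2*std) are common conventions, assumed here.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all *later* templates (no self-matches,
            # each unordered pair counted once).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d < r))
        return count

    b = count_matches(m)       # matches at length m
    a = count_matches(m + 1)   # matches at length m + 1
    if a == 0 or b == 0:
        return np.inf          # no matches: SampEn is unbounded
    return -np.log(a / b)
```

As a sanity check, a periodic signal (e.g. a sine wave) yields a much lower SampEn than white noise of the same length, which is the sense in which larger values mean less predictability. ApEn differs mainly in that it includes self-matches and averages log-counts per template, which biases it toward regularity on short records.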


Our in vitro entropy measures showed a significant change as network excitability was increased. The measures also identified peaks in complexity at 5–7 mM [K+]. These trends were not observed with linear measures. The in vitro peak complexity occurred at different levels for the timing component (EnInt) and the burst dynamics (ApEn and SampEn). We are currently incorporating these observations into our in silico model.


These results suggest that entropy measures can quantify additional aspects of a neural signal. Specifically, changing excitability (a common experimental manipulation) influences the complexity of the bursting patterns and may control a bifurcation point in in vitro network activity. These changes may provide further insight into respiratory instabilities in humans. We envision these tools (freely available from our laboratory) as useful for improving feature detection in neural networks and providing an additional data dimension.

Authors’ Affiliations

1. Department of Pediatrics, Case Western Reserve University
2. Department of Electrical Engineering & Computer Science, Case Western Reserve University
3. Department of Medicine, Case Western Reserve University


  1. Kaffashi F, Foglyano R, Wilson CG, Loparo KA: The effect of time delay on approximate and sample entropy calculations. Physica D: Nonlinear Phenomena. 2008, 237: 3069-3074. doi:10.1016/j.physd.2008.06.005.
  2. Smith JC, Ellenberger HH, Ballanyi K, Richter DW, Feldman JL: PreBötzinger complex: a brainstem region that may generate respiratory rhythm in mammals. Science. 1991, 254: 726-729. doi:10.1126/science.1683005.


© Foglyano et al; licensee BioMed Central Ltd. 2009

This article is published under license to BioMed Central Ltd.