
  • Poster presentation
  • Open Access

Symmetries constrain the transition to heterogeneous chaos in balanced networks

BMC Neuroscience 2015, 16 (Suppl 1): P229

https://doi.org/10.1186/1471-2202-16-S1-P229

Keywords

  • Firing Rate
  • Network Connectivity
  • Computational Capability
  • Chaotic Regime
  • Balanced Network

Biological neural circuits display both spontaneous asynchronous activity and complex, yet ordered, activity while actively responding to input. Recently, researchers have demonstrated how this spontaneous behavior underlies computational capabilities in large, recurrently connected networks of firing-rate [1, 2] and spiking [3] units.

Yet, not all spontaneous activity is equal: complex computations may require the rich phase-space of heterogeneous chaos, in which each neuron has a different time-dependent firing rate [2, 3]. Here, we address the question of how network connectivity structure may affect the transition to heterogeneous chaos in echo-state networks.
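The abstract does not write out the model equations; the following minimal sketch assumes the standard firing-rate (echo-state) dynamics tau dx/dt = -x + g J tanh(x), which is the usual setting for these transition-to-chaos studies [1, 2]. All function names and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_rate_network(J, g=1.5, T=2000, dt=0.1, seed=0):
    """Euler-integrate tau*dx/dt = -x + g*J*tanh(x) (tau = 1) and
    return the firing-rate trajectory, shape (T, N)."""
    rng = np.random.default_rng(seed)
    N = J.shape[0]
    x = 0.1 * rng.standard_normal(N)     # small random initial state
    rates = np.empty((T, N))
    for t in range(T):
        r = np.tanh(x)                   # firing rates
        x += dt * (-x + g * (J @ r))     # leaky rate dynamics
        rates[t] = r
    return rates

# Unconstrained Gaussian coupling with variance 1/N; for this model the
# transition to chaos occurs near g = 1, so g = 1.5 is in the chaotic regime.
N = 200
rng = np.random.default_rng(1)
J = rng.standard_normal((N, N)) / np.sqrt(N)
rates = simulate_rate_network(J, g=1.5)
```

In the heterogeneous chaotic regime, each row of `rates` evolves irregularly and each neuron follows a different time-dependent trajectory, which is the behavior whose onset the abstract studies.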

We choose a family of firing-rate networks that satisfy the neurobiological constraint of Dale's law (most neurons are either excitatory or inhibitory) and in which excitation and inhibition are balanced. We first study the transition to heterogeneous chaos in this setting, using principal component analysis (PCA) to provide a lower-dimensional description of network activity. We find that key characteristics of this transition differ between constrained networks and unconstrained networks with similar variability: the transition to heterogeneous chaos occurs at higher coupling strengths, and is variable across specific networks.
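A hedged sketch of the two ingredients in this paragraph: a random coupling matrix obeying Dale's law with excitation and inhibition balanced in expectation, and a PCA-based effective dimensionality of the resulting activity. The construction and both helper names are assumptions for illustration; the paper's exact network family may differ.

```python
import numpy as np

def dale_balanced_matrix(N, frac_exc=0.8, g=1.0, seed=0):
    """Random coupling obeying Dale's law: each presynaptic neuron
    (column) is purely excitatory or purely inhibitory.  Inhibitory
    columns are scaled so that excitation and inhibition cancel in
    expectation across each row (balance)."""
    rng = np.random.default_rng(seed)
    n_exc = int(frac_exc * N)
    W = np.abs(rng.standard_normal((N, N))) / np.sqrt(N)
    W[:, n_exc:] *= -frac_exc / (1.0 - frac_exc)  # balance inhibition
    return g * W

def pca_dimension(rates):
    """Effective dimensionality of activity (T x N array): the
    participation ratio of the PCA eigenvalue spectrum."""
    X = rates - rates.mean(axis=0)
    lam = np.clip(np.linalg.eigvalsh(np.cov(X, rowvar=False)), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()
```

The participation ratio lies between 1 (all variance along one principal component, e.g. a fixed point or simple limit cycle) and N (isotropic activity), so tracking it as coupling strength grows gives a lower-dimensional readout of the transition to heterogeneous chaos.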

These properties are a consequence of the fact that the constrained system may be described as a perturbation of a system with non-trivial symmetries. These symmetries imply the presence of both fixed points and periodic orbits that serve as an organizing center for solutions, even for large perturbations. In comparison, spectral characteristics of the network coupling matrix [4-6] are relatively uninformative about the behavior of the constrained system.
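For context, the spectral diagnostic in question is typically the spectral radius of the coupling matrix: for unconstrained Gaussian coupling with variance 1/N, the eigenvalues fill a disk of radius approximately 1 [4-6], and chaos onsets when the effective gain pushes eigenvalues past the stability boundary. A minimal sketch of computing this quantity (the point of the abstract being that it fails to predict the constrained system's behavior):

```python
import numpy as np

# Unconstrained Gaussian coupling, variance 1/N: by the circular law,
# the eigenvalues fill a disk of radius ~1 for large N.
N = 400
rng = np.random.default_rng(2)
J = rng.standard_normal((N, N)) / np.sqrt(N)

spectral_radius = np.abs(np.linalg.eigvals(J)).max()
```

In the unconstrained case, `g * spectral_radius` crossing 1 roughly marks the transition to chaos; for the Dale-constrained, balanced networks studied here, the abstract reports that this spectral picture is relatively uninformative.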

We next investigated the impact of this structure on the computational capabilities of constrained versus unconstrained networks, beginning with their responses to time-dependent inputs [3]. Compared with unconstrained networks, constrained networks performed better on population coding of firing rate, but were less able to separate two different time-dependent inputs in phase space.
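One way to quantify separation in phase space, sketched below under assumed details (sinusoidal inputs of two frequencies, identical initial conditions; not the authors' protocol), is the Euclidean distance between the state trajectories a network produces under the two inputs.

```python
import numpy as np

def driven_rates(J, inp, dt=0.1, seed=0):
    """Simulate tau*dx/dt = -x + J*tanh(x) + inp(t); return the
    firing-rate trajectory, shape (T, N)."""
    rng = np.random.default_rng(seed)
    N = J.shape[0]
    x = 0.1 * rng.standard_normal(N)
    T = len(inp)
    out = np.empty((T, N))
    for t in range(T):
        x += dt * (-x + J @ np.tanh(x) + inp[t])
        out[t] = np.tanh(x)
    return out

N, T = 100, 1000
rng = np.random.default_rng(3)
J = 1.2 * rng.standard_normal((N, N)) / np.sqrt(N)
w_in = rng.standard_normal(N)           # shared input weights
t = np.arange(T) * 0.1
inp_a = np.outer(np.sin(t), w_in)       # two inputs differing only
inp_b = np.outer(np.sin(1.5 * t), w_in) # in frequency

ra = driven_rates(J, inp_a)
rb = driven_rates(J, inp_b)
sep = np.linalg.norm(ra - rb, axis=1)   # phase-space distance over time
```

A network that separates the inputs well keeps `sep` large and persistent; the abstract reports that the constrained networks do worse on this measure than unconstrained ones.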

In contrast, the delayed transition to chaos had little effect on the ability of constrained networks to reproduce a learning task recently investigated in unconstrained networks [7]. Inspecting example networks, we found that adding the feedback loop quickly moves the effective network connectivity away from symmetry and into the chaotic regime at the onset of training.

Authors’ Affiliations

(1)
Department of Mathematics, Southern Methodist University, Dallas, TX, USA
(2)
Department of Applied Mathematics, University of Washington, Seattle, WA, USA

References

  1. Bertschinger N, Natschläger T: Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks. Neural Computation. 2004, 16 (7): 1413-1436.
  2. Sussillo D, Abbott LF: Generating Coherent Patterns of Activity from Chaotic Neural Networks. Neuron. 2009, 63 (4): 544-557.
  3. Ostojic S: Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons. Nature Neuroscience. 2014, 17: 594-600.
  4. Rajan K, Abbott LF: Eigenvalue Spectra of Random Matrices for Neural Networks. Physical Review Letters. 2006, 97 (18): 188104.
  5. Wei Y: Eigenvalue spectra of asymmetric random matrices for multicomponent neural networks. Physical Review E. 2012, 85: 066116.
  6. Aljadeff J, Stern M, Sharpee T: Chaos in heterogeneous neural networks: I. The critical transition point. BMC Neuroscience. 2014, 15 (Suppl 1): O20.
  7. Sussillo D, Barak O: Opening the Black Box: Low-Dimensional Dynamics in High-Dimensional Recurrent Neural Networks. Neural Computation. 2013, 25 (3): 1-24.
