  • Poster presentation
  • Open access

Slow points and adiabatic fixed points in recurrent neural networks

The time scales of cognitive information processing in the brain range, at least, from milliseconds (the time scale of the action potential) to many seconds (the time scale of short-term memory), spanning several orders of magnitude. In this context the slow dynamical components can be regarded as background processes that adiabatically modulate the parameters governing the fast neural activity. For recurrent neural networks the slow processes then adiabatically reshape the attractor landscape, including the adiabatic fixed points, possibly inducing both second-order bifurcations of the steady-state neural activity and first-order catastrophes.
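In schematic form this separation of time scales corresponds to a singularly perturbed fast–slow system (the notation below is ours, introduced for illustration, and is not taken from the abstract):

$$\dot{\mathbf{x}} = \mathbf{f}(\mathbf{x}, \boldsymbol{\theta}), \qquad \dot{\boldsymbol{\theta}} = \varepsilon\,\mathbf{g}(\mathbf{x}, \boldsymbol{\theta}), \qquad \varepsilon \ll 1,$$

where x collects the fast neural variables and θ the slowly adapting parameters. For frozen θ the adiabatic fixed points x*(θ) solve f(x*(θ), θ) = 0; as θ drifts, they trace out the slowly deforming attractor landscape.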

In this contribution we investigate the slow adaptation of the attractor landscape in exemplary small recurrent neural networks consisting of continuous-time point neurons [1]. The state of each of these integrate-and-fire neurons is fully determined by its membrane potential and two adapting internal parameters [2], the gain and the threshold, with the adaptation time scale 1/ε being substantially longer than the time scale of the primary neural activity. We point out that the neural dynamics is shaped not only by the adiabatic fixed points of the network, but also by points in phase space where the flow slows down considerably, called slow points or attractor ruins [3].
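As an illustration, the following minimal Python sketch simulates a small rate network of this type. The sigmoidal transfer function, the weight values, and the intrinsic-plasticity update (written in the spirit of the gradient rule of [2], with target mean rate μ) are our assumptions, not the authors' exact equations:

```python
import numpy as np

# Minimal sketch (assumed equations, not the authors' exact model):
# three continuous-time rate neurons whose gains a_i and thresholds b_i
# adapt slowly, with adaptation rate eps << 1, in the spirit of [2].

def simulate(T=500.0, dt=0.01, eps=1e-3, mu=0.3, seed=0):
    rng = np.random.default_rng(seed)
    n = 3
    W = rng.normal(0.0, 2.0, (n, n))        # recurrent weights (assumed)
    np.fill_diagonal(W, 0.0)                # no self-connections
    x = rng.normal(0.0, 0.1, n)             # membrane potentials (fast)
    a = np.ones(n)                          # gains (slow)
    b = np.zeros(n)                         # thresholds (slow)
    history = []
    for _ in range(int(T / dt)):
        y = 1.0 / (1.0 + np.exp(-a * (x - b)))   # firing rates
        x += dt * (-x + W @ y)                   # fast membrane dynamics
        # Slow intrinsic adaptation toward a target mean rate mu,
        # one common form of the gradient rule of [2]:
        db = 1.0 - (2.0 + 1.0 / mu) * y + y**2 / mu
        da = 1.0 / a + db * (x - b)
        b += dt * eps * db
        a = np.maximum(a + dt * eps * da, 1e-2)  # keep gains positive
        history.append(np.concatenate([y, a, b]))
    return np.array(history)

rates_and_params = simulate()
```

With eps of order 10⁻³ or smaller, the gains and thresholds drift on a time scale 1/ε much longer than the relaxation of the membrane potentials, which is the regime studied here.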

We rigorously examine the metadynamics of the attractor landscape for a three-neuron system, observing five different phases (see Fig. 1) for different values of the adaptation rate, four of which can be distinguished by the number and stability of the adiabatic fixed points. Three of the observed transitions are of second order. The remaining transition, to the phase with the lowest adaptation rate, is of first order and shows hysteresis. Remarkably, this transition occurs at very low adaptation rates of ε ~ 10⁻⁵ and is characterized by a higher-order catastrophe.
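Numerically, adiabatic fixed points and slow points can be located by freezing the slow parameters and searching for zeros, respectively near-zeros, of the fast flow. The sketch below (function names are illustrative and SciPy is an assumed dependency) finds fixed points by root-finding and slow points as local minima of the speed q(x) = |ẋ|²/2:

```python
import numpy as np
from scipy.optimize import minimize, root

def make_flow(W, a, b):
    """Fast flow for frozen slow parameters (a, b)."""
    def f(x):
        y = 1.0 / (1.0 + np.exp(-a * (x - b)))
        return -x + W @ y
    return f

def adiabatic_fixed_point(f, x0):
    """Solve f(x*) = 0 from an initial guess x0."""
    sol = root(f, x0)
    return sol.x if sol.success else None

def slow_point(f, x0):
    """Local minimum of q(x) = 0.5 * |f(x)|^2; a slow point
    (attractor ruin [3]) is a minimum with small but nonzero q."""
    q = lambda x: 0.5 * float(np.dot(f(x), f(x)))
    res = minimize(q, x0, method="BFGS")
    return res.x, res.fun
```

Classifying each fixed point via the eigenvalues of the Jacobian of f, and tracking how the set of fixed points changes as the slow parameters drift, yields a phase diagram of the kind shown in Fig. 1.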

Figure 1

Phase diagram of the three-neuron system showing five different phases, distinguished by the stability and the shape of the adiabatic fixed points (AFP), for different values of the adaptation rate ε. Three second-order transitions (dashed) and one first-order transition showing hysteresis (striped area) can be observed. The subplots, one per phase, show the firing rate (green), saddle-node AFPs (gray), stable AFPs (blue), and effectively attracting AFPs (red) over time.

We conclude that even relatively simple recurrent networks may exhibit highly non-trivial adapting attractor landscapes, and that the study of attractor metadynamics in the brain may be important for a further understanding of decision processes and dynamical memory recall.

References

  1. Linkerhand M, Gros C: Generating functionals for autonomous latching dynamics in attractor relict networks. Sci Rep. 2013, 3.

  2. Triesch J: A gradient rule for the plasticity of a neuron's intrinsic excitability. Proceedings of ICANN. 2005, Springer, 65-70.

  3. Gros C, Linkerhand M, Walther V: Attractor metadynamics in adapting neural networks. arXiv preprint. 2014, arXiv:1404.5417.


Acknowledgements

This work benefited from discussions with Bulcsú Sándor. The research was supported by funds of the DFG and the Studienstiftung des deutschen Volkes.

Author information

Corresponding author

Correspondence to Hendrik Wernecke.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Wernecke, H., Gros, C. Slow points and adiabatic fixed points in recurrent neural networks. BMC Neurosci 16 (Suppl 1), P88 (2015). https://doi.org/10.1186/1471-2202-16-S1-P88

