
Beyond dynamical mean-field theory of neural networks

We consider a network of N firing-rate neurons with discrete-time dynamics and a leak term γ. The gain of the sigmoid nonlinearity is controlled by a parameter g, and each neuron has a firing threshold θ drawn independently from a Gaussian distribution (thresholds are uncorrelated). The network is fully connected, with correlated Gaussian random synaptic weights of mean zero and covariance matrix C/N. When the synaptic weights are uncorrelated, the dynamical mean-field theory developed in [1–3] allows us to draw the bifurcation diagram of the model in the thermodynamic limit (N tending to infinity): in particular, there is a sharp transition from fixed point to chaos, characterized by the maximal Lyapunov exponent, which is known analytically in the thermodynamic limit. The bifurcation diagram is shown in Figure 1A. However, mean-field theory is exact only in the thermodynamic limit and only when synaptic weights are uncorrelated. What deviations from mean-field theory are observed when one departs from these hypotheses? We first studied the finite-size dynamics. For finite N, the maximal Lyapunov exponent exhibits a plateau at 0, corresponding to a transition to chaos by quasi-periodicity, where the dynamics is at the edge of chaos (Figure 1B). This plateau disappears in the thermodynamic limit. Mean-field theory thus neglects an important finite-size effect, since neuronal dynamics at the edge of chaos has strong implications for the learning performance of the network [4]. We also studied the effect of a weak correlation (of amplitude ε) between synaptic weights on the dynamics. Even when ε is small, one detects a significant deviation of the maximal Lyapunov exponent from its mean-field value (Figure 1C).
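As a rough illustration of the model and of how the maximal Lyapunov exponent can be estimated numerically, the following NumPy sketch simulates one such network and applies the standard Benettin (tangent-vector) method. The particular update rule x(t+1) = γ·x(t) + J·S(x(t)), the form of the sigmoid S, the parameter values, and the shared-component construction of weakly correlated weights are illustrative assumptions on our part; the abstract fixes the ingredients (leak γ, gain g, Gaussian thresholds θ, weight covariance C/N) but not these exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (the abstract does not fix these values).
N = 200          # number of neurons
g = 3.0          # sigmoid gain
gamma = 0.0      # leak term (gamma = 0 as in Figure 1B)
eps = 0.01       # weight-correlation amplitude
sigma = 1.0      # weight scale
T_transient, T_measure = 1_000, 5_000

# Weakly correlated Gaussian weights: a simple construction in which
# every weight mixes an i.i.d. term with one shared Gaussian, giving
# variance sigma^2/N and pairwise covariance eps*sigma^2/N (one special
# case of the general covariance C/N in the abstract).
zeta = rng.standard_normal()
J = (sigma / np.sqrt(N)) * (np.sqrt(1.0 - eps) * rng.standard_normal((N, N))
                            + np.sqrt(eps) * zeta)

# Uncorrelated Gaussian thresholds (std 0.1 as an illustrative choice).
theta = 0.1 * rng.standard_normal(N)

def S(x):
    # Sigmoid firing rate in (0, 1) with gain g and thresholds theta.
    return 0.5 * (1.0 + np.tanh(g * (x - theta)))

def dS(x):
    # Derivative of S, needed for the tangent (Jacobian) map.
    return 0.5 * g * (1.0 - np.tanh(g * (x - theta)) ** 2)

def step(x):
    # Assumed discrete-time dynamics: x(t+1) = gamma*x(t) + J @ S(x(t)).
    return gamma * x + J @ S(x)

# Discard the transient so the trajectory settles onto its attractor.
x = rng.standard_normal(N)
for _ in range(T_transient):
    x = step(x)

# Benettin method: propagate a unit tangent vector through the Jacobian
# D(x) = gamma*I + J @ diag(S'(x)) and average the log of its growth.
v = rng.standard_normal(N)
v /= np.linalg.norm(v)
lyap = 0.0
for _ in range(T_measure):
    v = gamma * v + J @ (dS(x) * v)
    growth = np.linalg.norm(v)
    lyap += np.log(growth)
    v /= growth
    x = step(x)

print("estimated maximal Lyapunov exponent:", lyap / T_measure)
```

Sweeping g or N in this sketch allows a comparison of the finite-N exponent with its mean-field counterpart, and running it with ε > 0 versus ε = 0 probes the correlation effect discussed above.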

Figure 1

(A) Bifurcation map: region 1, one stable fixed point; region 2, two stable fixed points; region 3, one fixed point and one strange attractor; region 4, one strange attractor. (B) Finite-N and mean-field maximal Lyapunov exponent (θ = 0.1, γ = 0). (C) Finite-N maximal Lyapunov exponent with weak correlation (ε = 0.01) and mean-field maximal Lyapunov exponent without correlation (ε = 0).

References

  1. Cessac B, Doyon B, Quoy M, Samuelides M: Mean-field equations, bifurcation map and route to chaos in discrete time neural networks. Physica D. 1994, 74: 24-44. 10.1016/0167-2789(94)90024-8.

  2. Cessac B: Increase in Complexity in Random Neural Networks. J Phys I France. 1995, 5: 409-432. 10.1051/jp1:1995135.

  3. Moynot O, Samuelides M: Large deviations and mean-field theory for asymmetric random recurrent neural networks. Probability Theory and Related Fields. 2002, 123: 41-75. 10.1007/s004400100182.

  4. Legenstein R, Maass W: Edge of Chaos and Prediction of Computational Performance for Neural Circuit Models. Neural Networks. 2007, 20: 323-334. 10.1016/j.neunet.2007.04.017.

Acknowledgements

This work was supported by INRIA, ERC grant NERVI (no. 227747), KEOPS ANR-CONICYT, European Union project FP7-269921 (BrainScales), Renvision (grant agreement no. 600847), and Mathemacs (FP7-ICT-2011.9.7).

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Muratori, M., Cessac, B. Beyond dynamical mean-field theory of neural networks. BMC Neurosci 14 (Suppl 1), P60 (2013). https://doi.org/10.1186/1471-2202-14-S1-P60
