Volume 16 Supplement 1

24th Annual Computational Neuroscience Meeting: CNS*2015

Open Access

Deterministic neural networks as sources of uncorrelated noise for probabilistic computations

  • Jakob Jordan1,
  • Tom Tetzlaff1,
  • Mihai Petrovici2,
  • Oliver Breitwieser2,
  • Ilja Bytschok2,
  • Johannes Bill3,
  • Johannes Schemmel2,
  • Karlheinz Meier2 and
  • Markus Diesmann1
BMC Neuroscience 2015, 16(Suppl 1):P62

https://doi.org/10.1186/1471-2202-16-S1-P62

Published: 18 December 2015

Neural-network models of brain function often rely on the presence of noise [1-4]. To date, the interplay between microscopic noise sources and network function is only poorly understood. In computer simulations and in neuromorphic hardware [5-7], the number of noise sources (random-number generators) is limited. As a consequence, neurons in large functional network models have to share noise sources and are therefore correlated. In general, it is unclear how shared-noise correlations affect the performance of functional network models. Further, there is so far no solution to the problem of how a limited number of noise sources can supply a large number of functional units with uncorrelated noise.
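To make the origin of such shared-noise correlations concrete, the following sketch estimates the mean pairwise correlation between units that each sum a few inputs drawn from a finite pool of independent noise sources. This is an illustration only, not the spiking model used in this study; all function names and parameter values are arbitrary choices for the example. The resulting correlation decays roughly as the inverse of the pool size.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_shared_noise_correlation(n_units=200, n_sources=50,
                                  inputs_per_unit=5, n_steps=20000):
    """Mean pairwise correlation when each unit sums `inputs_per_unit`
    inputs drawn at random from a pool of `n_sources` independent
    Gaussian noise sources."""
    sources = rng.standard_normal((n_steps, n_sources))
    picks = np.array([rng.choice(n_sources, size=inputs_per_unit, replace=False)
                      for _ in range(n_units)])
    signals = sources[:, picks].sum(axis=2)          # shape (n_steps, n_units)
    corr = np.corrcoef(signals.T)
    return corr[~np.eye(n_units, dtype=bool)].mean()

# shared-noise correlations shrink roughly as 1 / n_sources
for pool_size in (10, 50, 250):
    print(pool_size, round(mean_shared_noise_correlation(n_sources=pool_size), 3))
```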

Here, we investigate the performance of neural Boltzmann machines [2-4]. We show that correlations in the background activity are detrimental to the sampling performance and that the deviations from the target distribution scale inversely with the number of noise sources. Further, we show that this problem can be overcome by replacing the finite ensemble of independent noise sources with a recurrent neural network comprising the same number of units. As shown recently, inhibitory feedback, abundant in biological neural networks, serves as a powerful decorrelation mechanism [8, 9]: shared-noise correlations are actively suppressed by the network dynamics. Exploiting this effect significantly improves the network performance. Hence, recurrent neural networks can serve as natural finite-size noise sources for functional neural networks, both in biological and in synthetic neuromorphic substrates. Finally, we investigate the impact of the sampling network's parameters on its ability to faithfully represent a given well-defined distribution. We show that sampling networks with sufficiently strong negative feedback can intrinsically suppress correlations in the background activity and thereby improve their performance substantially.
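The decorrelating effect of inhibitory feedback can be illustrated with a highly simplified linear rate model; this is only a sketch, not the spiking networks of [8, 9] or the sampling networks studied here, and all parameter values are arbitrary. Units receive a mixture of shared and private Gaussian noise, and a global inhibitory feedback term is switched on or off.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_pairwise_corr(traces):
    corr = np.corrcoef(traces.T)
    return corr[~np.eye(len(corr), dtype=bool)].mean()

def simulate(n_units=100, n_steps=20000, dt=0.1, tau=1.0,
             w_inh=0.0, shared_frac=0.5):
    """Leaky rate units driven by a mix of shared and private Gaussian
    noise, optionally coupled by global inhibitory feedback (w_inh > 0)."""
    x = np.zeros(n_units)
    traces = np.empty((n_steps, n_units))
    for t in range(n_steps):
        shared = rng.standard_normal()              # one common noise source
        private = rng.standard_normal(n_units)      # one private source per unit
        noise = np.sqrt(shared_frac) * shared + np.sqrt(1 - shared_frac) * private
        feedback = -w_inh * x.mean()                # global inhibitory feedback
        x = x + dt / tau * (-x + feedback + noise)
        traces[t] = x
    return traces

print("feedforward (no feedback):", round(mean_pairwise_corr(simulate(w_inh=0.0)), 3))
print("with inhibitory feedback :", round(mean_pairwise_corr(simulate(w_inh=8.0)), 3))
```

In this toy model, the feedback term damps the common fluctuations that all units inherit from the shared source while leaving the private fluctuations largely untouched, which is the essence of the decorrelation mechanism described in [8, 9].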

Declarations

Acknowledgements

Partially supported by the Helmholtz Association portfolio theme SMHB, the Jülich Aachen Research Alliance (JARA), EU Grant 269921 (BrainScaleS), the Austrian Science Fund FWF #I753-N23 (PNEUMA), the Manfred Stärk Foundation, and EU Grant 604102 (Human Brain Project, HBP).

Authors’ Affiliations

(1)
Institute for Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Jülich Research Center and JARA
(2)
Kirchhoff Institute for Physics, Ruprecht-Karls-University Heidelberg
(3)
Institute for Theoretical Computer Science, Graz University of Technology

References

  1. Rolls ET, Deco G: The Noisy Brain. 2010, Oxford University Press.
  2. Hinton GE, Sejnowski TJ, Ackley DH: Boltzmann machines: constraint satisfaction networks that learn. Technical report, Carnegie-Mellon University. 1984.
  3. Buesing L, Bill J, Nessler B, Maass W: Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons. PLoS Comput Biol. 2011, 7: e1002211.
  4. Petrovici MA, Bill J, Bytschok I, Schemmel J, Meier K: Stochastic inference with deterministic spiking neurons. 2013, arXiv:1311.3211v1 [q-bio.NC].
  5. Schemmel J, Bruederle D, Gruebl A, Hock M, Meier K, Millner S: A wafer-scale neuromorphic hardware system for large-scale neural modeling. Proceedings of the 2010 International Symposium on Circuits and Systems (ISCAS), IEEE Press. 2010, 1947-1950.
  6. Bruederle D, Petrovici M, Vogginger B, Ehrlich M, Pfeil T, Millner S, Gruebl A, Wendt K, Mueller E, Schwartz MO, et al: A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems. Biological Cybernetics. 2011, 104: 263-296.
  7. Petrovici MA, Vogginger B, Mueller P, Breitwieser O, Lundqvist M, Muller L, Ehrlich M, Destexhe A, Lansner A, Schueffny R, et al: Characterization and compensation of network-level anomalies in mixed-signal neuromorphic modeling platforms. PLoS ONE. 2014, 9(10): e108590.
  8. Renart A, De La Rocha J, Bartho P, Hollender L, Parga N, Reyes A, Harris KD: The asynchronous state in cortical circuits. Science. 2010, 327: 587-590.
  9. Tetzlaff T, Helias M, Einevoll G, Diesmann M: Decorrelation of neural-network activity by inhibitory feedback. PLoS Comput Biol. 2012, 8: e1002596.

Copyright

© Tetzlaff et al. 2015

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
